7491 1727203957.56422: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-G1p
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
7491 1727203957.56905: Added group all to inventory
7491 1727203957.56907: Added group ungrouped to inventory
7491 1727203957.56911: Group all now contains ungrouped
7491 1727203957.56914: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml
7491 1727203957.73490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
7491 1727203957.73552: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
7491 1727203957.73579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
7491 1727203957.73641: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
7491 1727203957.73718: Loaded config def from plugin (inventory/script)
7491 1727203957.73721: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
7491 1727203957.73761: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
7491 1727203957.74494: Loaded config def from plugin (inventory/yaml)
7491 1727203957.74497: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
7491 1727203957.74588: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
7491 1727203957.74997: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
7491 1727203957.75000: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
7491 1727203957.75004: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
7491 1727203957.75010: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
7491 1727203957.75015: Loading data from /tmp/network-M6W/inventory-5vW.yml
7491 1727203957.75085: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto
7491 1727203957.75152: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
7491 1727203957.75196: Loading data from /tmp/network-M6W/inventory-5vW.yml
7491 1727203957.75280: group all already in inventory
7491 1727203957.75287: set inventory_file for managed-node1
7491 1727203957.75291: set inventory_dir for managed-node1
7491 1727203957.75292: Added host managed-node1 to inventory
7491 1727203957.75295: Added host managed-node1 to group all
7491 1727203957.75296: set ansible_host for managed-node1
7491 1727203957.75297: set ansible_ssh_extra_args for managed-node1
7491 1727203957.75300: set inventory_file for managed-node2
7491 1727203957.75303: set inventory_dir for managed-node2
7491 1727203957.75303: Added host managed-node2 to inventory
7491 1727203957.75305: Added host managed-node2 to group all
7491 1727203957.75306: set ansible_host for managed-node2
7491 1727203957.75306: set ansible_ssh_extra_args for managed-node2
7491 1727203957.75309: set inventory_file for managed-node3
7491 1727203957.75311: set inventory_dir for managed-node3
7491 1727203957.75312: Added host managed-node3 to inventory
7491 1727203957.75313: Added host managed-node3 to group all
7491 1727203957.75314: set ansible_host for managed-node3
7491 1727203957.75315: set ansible_ssh_extra_args for managed-node3
7491 1727203957.75317: Reconcile groups and hosts in inventory.
7491 1727203957.75321: Group ungrouped now contains managed-node1
7491 1727203957.75323: Group ungrouped now contains managed-node2
7491 1727203957.75324: Group ungrouped now contains managed-node3
7491 1727203957.75402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
7491 1727203957.75540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
7491 1727203957.75592: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
7491 1727203957.75620: Loaded config def from plugin (vars/host_group_vars)
7491 1727203957.75622: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
7491 1727203957.75629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
7491 1727203957.75637: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
7491 1727203957.75681: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
7491 1727203957.76048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203957.76145: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
7491 1727203957.76186: Loaded config def from plugin (connection/local)
7491 1727203957.76189: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
7491 1727203957.76818: Loaded config def from plugin (connection/paramiko_ssh)
7491 1727203957.76822: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
7491 1727203957.78835: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7491 1727203957.78878: Loaded config def from plugin (connection/psrp)
7491 1727203957.78881: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
7491 1727203957.79704: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7491 1727203957.79745: Loaded config def from plugin (connection/ssh)
7491 1727203957.79748: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
7491 1727203957.80098: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7491 1727203957.80135: Loaded config def from plugin (connection/winrm)
7491 1727203957.80138: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
7491 1727203957.80173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
7491 1727203957.80236: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
7491 1727203957.80303: Loaded config def from plugin (shell/cmd)
7491 1727203957.80305: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
7491 1727203957.80329: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
7491 1727203957.80393: Loaded config def from plugin (shell/powershell)
7491 1727203957.80395: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
7491 1727203957.80444: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
7491 1727203957.80612: Loaded config def from plugin (shell/sh)
7491 1727203957.80614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
7491 1727203957.80645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
7491 1727203957.80766: Loaded config def from plugin (become/runas)
7491 1727203957.80769: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
7491 1727203957.80985: Loaded config def from plugin (become/su)
7491 1727203957.80987: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
7491 1727203957.81141: Loaded config def from plugin (become/sudo)
7491 1727203957.81143: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
7491 1727203957.81176: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml
7491 1727203957.81510: in VariableManager get_vars()
7491 1727203957.81533: done with get_vars()
7491 1727203957.81667: trying /usr/local/lib/python3.12/site-packages/ansible/modules
7491 1727203957.85198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
7491 1727203957.85283: in VariableManager get_vars()
7491 1727203957.85287: done with get_vars()
7491 1727203957.85289: variable 'playbook_dir' from source: magic vars
7491 1727203957.85290: variable 'ansible_playbook_python' from source: magic vars
7491 1727203957.85290: variable 'ansible_config_file' from source: magic vars
7491 1727203957.85291: variable 'groups' from source: magic vars
7491 1727203957.85292: variable 'omit' from source: magic vars
7491 1727203957.85292: variable 'ansible_version' from source: magic vars
7491 1727203957.85293: variable 'ansible_check_mode' from source: magic vars
7491 1727203957.85293: variable 'ansible_diff_mode' from source: magic vars
7491 1727203957.85294: variable 'ansible_forks' from source: magic vars
7491 1727203957.85294: variable 'ansible_inventory_sources' from source: magic vars
7491 1727203957.85295: variable 'ansible_skip_tags' from source: magic vars
7491 1727203957.85295: variable 'ansible_limit' from source: magic vars
7491 1727203957.85295: variable 'ansible_run_tags' from source: magic vars
7491 1727203957.85296: variable 'ansible_verbosity' from source: magic vars
7491 1727203957.85321: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml
7491 1727203957.86056: in VariableManager get_vars()
7491 1727203957.86072: done with get_vars()
7491 1727203957.86103: in VariableManager get_vars()
7491 1727203957.86115: done with get_vars()
7491 1727203957.86147: in VariableManager get_vars()
7491 1727203957.86158: done with get_vars()
7491 1727203957.86278: in VariableManager get_vars()
7491 1727203957.86290: done with get_vars()
7491 1727203957.86294: variable 'omit' from source: magic vars
7491 1727203957.86310: variable 'omit' from source: magic vars
7491 1727203957.86339: in VariableManager get_vars()
7491 1727203957.86348: done with get_vars()
7491 1727203957.86390: in VariableManager get_vars()
7491 1727203957.86402: done with get_vars()
7491 1727203957.86434: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7491 1727203957.86680: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7491 1727203957.86807: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7491 1727203957.87397: in VariableManager get_vars()
7491 1727203957.87413: done with get_vars()
7491 1727203957.88133: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
7491 1727203957.88378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7491 1727203957.89449: in VariableManager get_vars()
7491 1727203957.89461: done with get_vars()
7491 1727203957.89485: in VariableManager get_vars()
7491 1727203957.89505: done with get_vars()
7491 1727203957.89783: in VariableManager get_vars()
7491 1727203957.89794: done with get_vars()
7491 1727203957.89797: variable 'omit' from source: magic vars
7491 1727203957.89804: variable 'omit' from source: magic vars
7491 1727203957.89824: in VariableManager get_vars()
7491 1727203957.89833: done with get_vars()
7491 1727203957.89846: in VariableManager get_vars()
7491 1727203957.89855: done with get_vars()
7491 1727203957.89878: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7491 1727203957.89939: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7491 1727203957.89985: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7491 1727203957.90402: in VariableManager get_vars()
7491 1727203957.90427: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7491 1727203957.92712: in VariableManager get_vars()
7491 1727203957.92741: done with get_vars()
7491 1727203957.92933: in VariableManager get_vars()
7491 1727203957.92989: done with get_vars()
7491 1727203957.93052: in VariableManager get_vars()
7491 1727203957.93095: done with get_vars()
7491 1727203957.93102: variable 'omit' from source: magic vars
7491 1727203957.93110: variable 'omit' from source: magic vars
7491 1727203957.93149: in VariableManager get_vars()
7491 1727203957.93160: done with get_vars()
7491 1727203957.93177: in VariableManager get_vars()
7491 1727203957.93189: done with get_vars()
7491 1727203957.93210: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7491 1727203957.93290: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7491 1727203957.93337: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7491 1727203957.93566: in VariableManager get_vars()
7491 1727203957.93581: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7491 1727203957.95817: in VariableManager get_vars()
7491 1727203957.95840: done with get_vars()
7491 1727203957.95883: in VariableManager get_vars()
7491 1727203957.95903: done with get_vars()
7491 1727203957.96846: in VariableManager get_vars()
7491 1727203957.96868: done with get_vars()
7491 1727203957.96888: variable 'omit' from source: magic vars
7491 1727203957.96899: variable 'omit' from source: magic vars
7491 1727203957.96934: in VariableManager get_vars()
7491 1727203957.96954: done with get_vars()
7491 1727203957.96994: in VariableManager get_vars()
7491 1727203957.97037: done with get_vars()
7491 1727203957.97386: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7491 1727203957.97509: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7491 1727203957.97640: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7491 1727203958.02640: in VariableManager get_vars()
7491 1727203958.02691: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7491 1727203958.05549: in VariableManager get_vars()
7491 1727203958.05582: done with get_vars()
7491 1727203958.05623: in VariableManager get_vars()
7491 1727203958.05646: done with get_vars()
7491 1727203958.05703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
7491 1727203958.05720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
7491 1727203958.06015: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
7491 1727203958.06198: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
7491 1727203958.06201: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
7491 1727203958.06236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
7491 1727203958.06261: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
7491 1727203958.06449: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
7491 1727203958.06509: Loaded config def from plugin (callback/default)
7491 1727203958.06511: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
7491 1727203958.07367: Loaded config def from plugin (callback/junit)
7491 1727203958.07370: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
7491 1727203958.07410: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
7491 1727203958.07472: Loaded config def from plugin (callback/minimal)
7491 1727203958.07475: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
7491 1727203958.07511: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
7491 1727203958.07567: Loaded config def from plugin (callback/tree)
7491 1727203958.07569: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
7491 1727203958.07683: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
7491 1727203958.07686: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_auto_gateway_nm.yml ********************************************
2 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml
7491 1727203958.07713: in VariableManager get_vars()
7491 1727203958.07725: done with get_vars()
7491 1727203958.07731: in VariableManager get_vars()
7491 1727203958.07739: done with get_vars()
7491 1727203958.07742: variable 'omit' from source: magic vars
7491 1727203958.07782: in VariableManager get_vars()
7491 1727203958.07796: done with get_vars()
7491 1727203958.07817: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_auto_gateway.yml' with nm as provider] *****
7491 1727203958.08379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
7491 1727203958.08454: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
7491 1727203958.08501: getting the remaining hosts for this loop
7491 1727203958.08503: done getting the remaining hosts for this loop
7491 1727203958.08506: getting the next task for host managed-node3
7491 1727203958.08510: done getting next task for host managed-node3
7491 1727203958.08511: ^ task is: TASK: Gathering Facts
7491 1727203958.08513: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203958.08515: getting variables
7491 1727203958.08523: in VariableManager get_vars()
7491 1727203958.08533: Calling all_inventory to load vars for managed-node3
7491 1727203958.08536: Calling groups_inventory to load vars for managed-node3
7491 1727203958.08538: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203958.08551: Calling all_plugins_play to load vars for managed-node3
7491 1727203958.08562: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203958.08568: Calling groups_plugins_play to load vars for managed-node3
7491 1727203958.08623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203958.08677: done with get_vars()
7491 1727203958.08684: done getting variables
7491 1727203958.08759: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:6
Tuesday 24 September 2024  14:52:38 -0400 (0:00:00.012)       0:00:00.012 *****
7491 1727203958.08835: entering _queue_task() for managed-node3/gather_facts
7491 1727203958.08837: Creating lock for gather_facts
7491 1727203958.09177: worker is 1 (out of 1 available)
7491 1727203958.09189: exiting _queue_task() for managed-node3/gather_facts
7491 1727203958.09203: done queuing things up, now waiting for results queue to drain
7491 1727203958.09205: waiting for pending results...
7491 1727203958.09471: running TaskExecutor() for managed-node3/TASK: Gathering Facts
7491 1727203958.09580: in run() - task 0affcd87-79f5-0a4a-ad01-000000000155
7491 1727203958.09670: variable 'ansible_search_path' from source: unknown
7491 1727203958.09720: calling self._execute()
7491 1727203958.09846: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203958.09889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203958.09932: variable 'omit' from source: magic vars
7491 1727203958.10049: variable 'omit' from source: magic vars
7491 1727203958.10085: variable 'omit' from source: magic vars
7491 1727203958.10135: variable 'omit' from source: magic vars
7491 1727203958.10259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7491 1727203958.10307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7491 1727203958.10340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7491 1727203958.10380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203958.10398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203958.10442: variable 'inventory_hostname' from source: host vars for 'managed-node3'
7491 1727203958.10451: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203958.10458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203958.10580: Set connection var ansible_timeout to 10
7491 1727203958.10600: Set connection var ansible_pipelining to False
7491 1727203958.10612: Set connection var ansible_shell_type to sh
7491 1727203958.10625: Set connection var ansible_module_compression to ZIP_DEFLATED
7491 1727203958.10639: Set connection var ansible_shell_executable to /bin/sh
7491 1727203958.10651: Set connection var ansible_connection to ssh
7491 1727203958.10680: variable 'ansible_shell_executable' from source: unknown
7491 1727203958.10689: variable 'ansible_connection' from source: unknown
7491 1727203958.10704: variable 'ansible_module_compression' from source: unknown
7491 1727203958.10711: variable 'ansible_shell_type' from source: unknown
7491 1727203958.10721: variable 'ansible_shell_executable' from source: unknown
7491 1727203958.10729: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203958.10737: variable 'ansible_pipelining' from source: unknown
7491 1727203958.10744: variable 'ansible_timeout' from source: unknown
7491 1727203958.10756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203958.11012: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7491 1727203958.11038: variable 'omit' from source: magic vars
7491 1727203958.11048: starting attempt loop
7491 1727203958.11053: running the handler
7491 1727203958.11077: variable 'ansible_facts' from source: unknown
7491 1727203958.11102: _low_level_execute_command(): starting
7491 1727203958.11115: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7491 1727203958.11961: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7491 1727203958.11982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203958.11999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203958.12030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203958.12102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
7491 1727203958.12118: stderr chunk (state=3): >>>debug2: match not found <<<
7491 1727203958.12167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203958.12192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7491 1727203958.12206: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<<
7491 1727203958.12220: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7491 1727203958.12276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203958.12291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203958.12310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203958.12325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
7491 1727203958.12337: stderr chunk (state=3): >>>debug2: match found <<<
7491 1727203958.12351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203958.12438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7491 1727203958.12460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7491 1727203958.12486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7491 1727203958.12560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7491 1727203958.14176: stdout chunk (state=3): >>>/root <<<
7491 1727203958.14374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7491 1727203958.14377: stdout chunk (state=3): >>><<<
7491 1727203958.14380: stderr chunk (state=3): >>><<<
7491 1727203958.14511: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.87 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7491 1727203958.14518: _low_level_execute_command(): starting
7491 1727203958.14521: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203958.1440659-7548-112931182985139 `" && echo ansible-tmp-1727203958.1440659-7548-112931182985139="` echo /root/.ansible/tmp/ansible-tmp-1727203958.1440659-7548-112931182985139 `" ) && sleep 0'
7491 1727203958.15386: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203958.15397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203958.15434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.87 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203958.15437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203958.15569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7491 1727203958.15582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4 <<<
7491 1727203958.15646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7491 1727203958.17459: stdout chunk (state=3): >>>ansible-tmp-1727203958.1440659-7548-112931182985139=/root/.ansible/tmp/ansible-tmp-1727203958.1440659-7548-112931182985139 <<<
7491 1727203958.17566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7491 1727203958.17643: stderr chunk (state=3): >>><<<
7491 1727203958.17646: stdout chunk (state=3): >>><<<
7491 1727203958.17876: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203958.1440659-7548-112931182985139=/root/.ansible/tmp/ansible-tmp-1727203958.1440659-7548-112931182985139
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.87 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7491 1727203958.17880: variable 'ansible_module_compression' from source: unknown
7491 1727203958.17883: ANSIBALLZ: Using generic lock for ansible.legacy.setup
7491 1727203958.17885: ANSIBALLZ: Acquiring lock
7491 1727203958.17887: ANSIBALLZ: Lock acquired: 139674606106048
7491 1727203958.17889: ANSIBALLZ: Creating module
7491 1727203958.61822: ANSIBALLZ: Writing module into payload
7491 1727203958.62027: ANSIBALLZ: Writing module
7491 1727203958.62063: ANSIBALLZ: Renaming module
7491 1727203958.62080: ANSIBALLZ: Done creating module
7491 1727203958.62129: variable 'ansible_facts' from source: unknown
7491 1727203958.62141: variable 'inventory_hostname' from source: host vars for 'managed-node3'
7491 1727203958.62157: _low_level_execute_command(): starting
7491 1727203958.62170: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"';
echo ENDFOUND && sleep 0' 7491 1727203958.62910: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203958.62928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203958.62951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203958.62975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203958.63019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203958.63032: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203958.63046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203958.63076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203958.63089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203958.63100: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203958.63111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203958.63128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203958.63144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203958.63155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203958.63173: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203958.63191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203958.63265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203958.63300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 
1727203958.63322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203958.63413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203958.65021: stdout chunk (state=3): >>>PLATFORM <<< 7491 1727203958.65107: stdout chunk (state=3): >>>Linux <<< 7491 1727203958.65124: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 7491 1727203958.65268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203958.65368: stderr chunk (state=3): >>><<< 7491 1727203958.65381: stdout chunk (state=3): >>><<< 7491 1727203958.65476: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203958.65487 [managed-node3]: 
found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 7491 1727203958.65490: _low_level_execute_command(): starting 7491 1727203958.65570: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 7491 1727203958.65648: Sending initial data 7491 1727203958.65651: Sent initial data (1181 bytes) 7491 1727203958.66255: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203958.66274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203958.66289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203958.66306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203958.66359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203958.66376: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203958.66390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203958.66408: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203958.66422: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203958.66433: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203958.66446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203958.66466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203958.66486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203958.66498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203958.66508: stderr chunk (state=3): >>>debug2: match found <<< 7491 
1727203958.66526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203958.66614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203958.66640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203958.66655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203958.66733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203958.70461: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 7491 1727203958.70837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203958.70940: stderr chunk (state=3): >>><<< 7491 1727203958.70951: stdout chunk (state=3): >>><<< 7491 1727203958.71276: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203958.71279: variable 'ansible_facts' from source: unknown 7491 1727203958.71282: variable 'ansible_facts' from source: unknown 7491 1727203958.71284: variable 'ansible_module_compression' from source: unknown 7491 1727203958.71286: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7491 1727203958.71289: variable 'ansible_facts' from source: unknown 7491 1727203958.71314: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203958.1440659-7548-112931182985139/AnsiballZ_setup.py 7491 1727203958.71501: Sending initial data 7491 1727203958.71505: Sent initial data (152 bytes) 7491 1727203958.72597: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203958.72614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203958.72633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203958.72652: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203958.72700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203958.72722: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203958.72737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203958.72756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203958.72772: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203958.72784: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203958.72798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203958.72822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203958.72842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203958.72855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203958.72868: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203958.72902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203958.72989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203958.73013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203958.73042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203958.73115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203958.74830: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203958.74872: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203958.74939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmphe8ozwrn /root/.ansible/tmp/ansible-tmp-1727203958.1440659-7548-112931182985139/AnsiballZ_setup.py <<< 7491 1727203958.74953: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203958.78161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203958.78571: stderr chunk (state=3): >>><<< 7491 1727203958.78577: stdout chunk (state=3): >>><<< 7491 1727203958.78580: done transferring module to remote 7491 1727203958.78582: _low_level_execute_command(): starting 7491 1727203958.78585: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203958.1440659-7548-112931182985139/ /root/.ansible/tmp/ansible-tmp-1727203958.1440659-7548-112931182985139/AnsiballZ_setup.py && sleep 0' 7491 1727203958.80318: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203958.80332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203958.80343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203958.80355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 
1727203958.80401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203958.80408: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203958.80419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203958.80436: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203958.80444: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203958.80454: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203958.80457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203958.80467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203958.80583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203958.80590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203958.80598: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203958.80608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203958.80682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203958.80697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203958.80707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203958.80933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203958.82699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203958.82703: stdout chunk (state=3): >>><<< 7491 1727203958.82708: stderr chunk (state=3): >>><<< 7491 1727203958.82729: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203958.82732: _low_level_execute_command(): starting 7491 1727203958.82736: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203958.1440659-7548-112931182985139/AnsiballZ_setup.py && sleep 0' 7491 1727203958.83930: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203958.84886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203958.84897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203958.84913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203958.84961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 
1727203958.84966: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203958.84978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203958.84992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203958.84999: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203958.85006: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203958.85014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203958.85034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203958.85037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203958.85046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203958.85054: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203958.85061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203958.85150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203958.85182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203958.85187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203958.85233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203958.87149: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # <<< 7491 1727203958.87153: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 7491 1727203958.87204: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 7491 1727203958.87230: stdout chunk (state=3): >>>import 'posix' # <<< 7491 
1727203958.87269: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 7491 1727203958.87310: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 7491 1727203958.87354: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203958.87400: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 7491 1727203958.87425: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241ab1edc0> <<< 7491 1727203958.87494: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 7491 1727203958.87528: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aac33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241ab1eb20> <<< 7491 1727203958.87548: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 7491 1727203958.87569: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241ab1eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 7491 1727203958.87598: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aac3490> <<< 7491 1727203958.87635: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 7491 1727203958.87648: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # <<< 7491 1727203958.87661: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aac3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aac3670> <<< 7491 1727203958.87694: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 7491 1727203958.87696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 7491 1727203958.87721: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 7491 1727203958.87743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 7491 1727203958.87784: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 7491 1727203958.87806: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa7a190> <<< 7491 1727203958.87827: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object 
from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 7491 1727203958.87904: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa7a220> <<< 7491 1727203958.87930: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 7491 1727203958.87968: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa9d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa7a940> <<< 7491 1727203958.88023: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aadb880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 7491 1727203958.88027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa73d90> <<< 7491 1727203958.88098: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 7491 1727203958.88101: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa9dd90> <<< 7491 1727203958.88139: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aac3970> <<< 7491 1727203958.88162: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 
2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 7491 1727203958.88493: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 7491 1727203958.88557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 7491 1727203958.88561: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 7491 1727203958.88603: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 7491 1727203958.88606: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 7491 1727203958.88621: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7d2f10> <<< 7491 1727203958.88669: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7d90a0> <<< 7491 1727203958.88768: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 7491 
1727203958.88796: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7cc5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7d36a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7d23d0> <<< 7491 1727203958.88856: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 7491 1727203958.88884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 7491 1727203958.88906: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 7491 1727203958.88995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 7491 1727203958.89022: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a690e50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a690940> import 'itertools' # <<< 7491 1727203958.89048: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a690f40> <<< 7491 1727203958.89078: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 7491 1727203958.89125: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a690d90> <<< 7491 1727203958.89137: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1100> import '_collections' # <<< 7491 1727203958.89179: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7aedc0> import '_functools' # <<< 7491 1727203958.89208: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7a76a0> <<< 7491 1727203958.89256: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 7491 1727203958.89279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7ba700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7daeb0> <<< 7491 1727203958.89295: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 7491 1727203958.89346: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a6a1d00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7ae2e0> <<< 7491 1727203958.89373: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a7ba310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7e0a60> <<< 7491 1727203958.89403: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 7491 1727203958.89445: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 7491 1727203958.89498: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1ee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1e20> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1d90> <<< 7491 1727203958.89563: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 7491 1727203958.89597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 7491 1727203958.89645: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a674400> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 7491 1727203958.89685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6744f0> <<< 7491 1727203958.90013: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a9f70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a3ac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a3490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches 
/usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5c2250> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a65f550> <<< 7491 1727203958.90054: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a3f40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7e00d0> <<< 7491 1727203958.90080: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 7491 1727203958.90112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5d4b80> import 'errno' # <<< 7491 1727203958.90169: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5d4eb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 7491 1727203958.90199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 7491 1727203958.90220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f241a5e57c0> <<< 7491 1727203958.90231: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 7491 1727203958.90261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 7491 1727203958.90285: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5e5d00> <<< 7491 1727203958.90324: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a57f430> <<< 7491 1727203958.90347: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5d4fa0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 7491 1727203958.90407: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a58f310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5e5640> <<< 7491 1727203958.90430: stdout chunk (state=3): >>>import 'pwd' # <<< 7491 1727203958.90444: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f241a58f3d0> <<< 7491 1727203958.90506: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1a60> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 7491 1727203958.90531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 7491 1727203958.90547: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 7491 1727203958.90588: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5ab730> <<< 7491 1727203958.90616: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5aba00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5ab7f0> <<< 7491 1727203958.90637: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5ab8e0> 
<<< 7491 1727203958.90670: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 7491 1727203958.90847: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5abd30> <<< 7491 1727203958.90896: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5b5280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5ab970> <<< 7491 1727203958.90925: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a59eac0> <<< 7491 1727203958.90945: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1640> <<< 7491 1727203958.90956: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 7491 1727203958.91007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 7491 1727203958.91038: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5abb20> <<< 7491 1727203958.91180: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 7491 1727203958.91192: stdout chunk (state=3): >>>import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f2419fe6700> <<< 7491 1727203958.91420: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 7491 1727203958.91514: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.91577: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 7491 1727203958.91585: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7491 1727203958.91601: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 7491 1727203958.92773: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.93763: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f25850> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203958.93825: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' 
loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203958.93843: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f25160> <<< 7491 1727203958.93892: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f25280> <<< 7491 1727203958.93930: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f25fa0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 7491 1727203958.93998: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f254f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f25dc0> <<< 7491 1727203958.94002: stdout chunk (state=3): >>>import 'atexit' # <<< 7491 1727203958.94055: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f25580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 7491 1727203958.94137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 7491 1727203958.94188: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f25100> <<< 7491 1727203958.94192: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches 
/usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 7491 1727203958.94313: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419efa0a0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419dff370> <<< 7491 1727203958.94339: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419dff070> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 7491 1727203958.94412: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419dffcd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f0ddc0> <<< 7491 1727203958.94570: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f0d3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches 
/usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 7491 1727203958.94833: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f0df40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f5af40> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f27d60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f27430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419ed8af0> <<< 7491 1727203958.94846: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f27550> <<< 7491 1727203958.94883: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 
'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f27580> <<< 7491 1727203958.94927: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 7491 1727203958.94930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 7491 1727203958.94963: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 7491 1727203958.95068: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e6dfa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f6c280> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 7491 1727203958.95076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 7491 1727203958.95125: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e6a820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f6c400> <<< 7491 1727203958.95158: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 7491 
1727203958.95186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203958.95227: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 7491 1727203958.95230: stdout chunk (state=3): >>>import '_string' # <<< 7491 1727203958.95282: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f6cc40> <<< 7491 1727203958.95409: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419e6a7c0> <<< 7491 1727203958.95505: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f051c0> <<< 7491 1727203958.95534: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f6c9d0> <<< 7491 1727203958.95581: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f6c550> <<< 7491 
1727203958.95620: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f65940> <<< 7491 1727203958.95641: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 7491 1727203958.95694: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e5f910> <<< 7491 1727203958.95880: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e7cdc0> <<< 7491 1727203958.95883: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419e69550> <<< 7491 1727203958.95942: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e5feb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f2419e69970> <<< 7491 1727203958.95962: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 7491 1727203958.96046: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.96138: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.96141: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 7491 1727203958.96179: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 7491 1727203958.96278: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.96367: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.96813: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.97285: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 7491 1727203958.97332: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203958.97387: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419ea57f0> <<< 7491 1727203958.97459: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419eaa8b0> <<< 7491 1727203958.97469: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24199fb940> <<< 7491 1727203958.97524: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available <<< 7491 1727203958.97551: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 7491 1727203958.97678: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.97813: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 7491 1727203958.97847: stdout chunk (state=3): >>>import 'copy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2419ee3730> # zipimport: zlib available <<< 7491 1727203958.98239: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.98600: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.98653: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.98734: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 7491 1727203958.98737: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.98755: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.98803: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 7491 1727203958.98806: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.98859: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.98965: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 7491 1727203958.98978: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 7491 1727203958.99012: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.99063: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 7491 1727203958.99068: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.99230: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.99433: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 7491 1727203958.99460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 7491 1727203958.99465: stdout chunk (state=3): >>>import '_ast' # <<< 7491 1727203958.99537: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f282e0> <<< 7491 1727203958.99540: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.99601: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.99673: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py <<< 7491 1727203958.99677: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 7491 1727203958.99701: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.99734: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 7491 1727203958.99769: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 7491 1727203958.99817: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203958.99935: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.00003: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 7491 1727203959.00066: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203959.00763: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e9c880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419877550> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from 
'/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419ead910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419ef7970> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419ee1850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 7491 1727203959.01025: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 7491 1727203959.01030: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.01042: stdout 
chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7491 1727203959.01113: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7491 1727203959.01231: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available <<< 7491 1727203959.01294: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.01306: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.01348: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 7491 1727203959.01484: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.01630: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.01682: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.01710: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203959.01741: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 7491 1727203959.01758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 7491 1727203959.01776: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 7491 1727203959.01809: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241977cc70> <<< 7491 1727203959.01829: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 7491 1727203959.01847: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 7491 1727203959.01902: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 7491 1727203959.01922: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24199dca30> <<< 7491 1727203959.01959: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24199dc9a0> <<< 7491 1727203959.02039: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419a28b20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419a28550> <<< 7491 1727203959.02067: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419a102e0> <<< 7491 1727203959.02085: stdout chunk (state=3): >>>import 
'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419a10970> <<< 7491 1727203959.02106: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 7491 1727203959.02144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 7491 1727203959.02170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 7491 1727203959.02188: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24199c12b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24199c1a00> <<< 7491 1727203959.02215: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 7491 1727203959.02257: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24199c1940> <<< 7491 1727203959.02269: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 7491 1727203959.02290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 7491 1727203959.02315: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24197dd0d0> <<< 7491 1727203959.02721: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419e993a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419a10670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib 
available # zipimport: zlib available <<< 7491 1727203959.02758: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 7491 1727203959.02785: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.02834: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 7491 1727203959.02839: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.02874: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.02923: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 7491 1727203959.02927: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.02982: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.03024: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.03079: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.03125: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 7491 1727203959.03139: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.03519: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 7491 1727203959.03887: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 7491 1727203959.03935: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.03986: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.04006: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.04056: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 7491 1727203959.04061: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.04118: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 7491 1727203959.04169: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.04237: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available <<< 7491 1727203959.04253: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.04285: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 7491 1727203959.04313: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.04347: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 7491 1727203959.04414: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.04490: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 7491 1727203959.04524: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24196cceb0> <<< 7491 1727203959.04537: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 7491 1727203959.04561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 7491 1727203959.04734: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24196cc9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 7491 1727203959.04782: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.04836: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib 
available <<< 7491 1727203959.04914: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.05008: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 7491 1727203959.05011: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.05057: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.05129: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 7491 1727203959.05171: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.05226: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 7491 1727203959.05240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 7491 1727203959.05379: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419738bb0> <<< 7491 1727203959.05625: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24196f5a60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 7491 1727203959.05680: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.05739: 
stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 7491 1727203959.05798: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.05881: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.05966: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.06409: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241973f040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241973f6d0> import ansible.module_utils.facts.system.user # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 7491 1727203959.06412: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.06455: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 7491 1727203959.06458: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.06582: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.07008: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 7491 1727203959.07037: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.07054: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.07172: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.07289: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 7491 1727203959.07305: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.07406: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.07510: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 7491 1727203959.07549: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.07584: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.08014: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.08433: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 7491 1727203959.08437: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.08530: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.08630: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 7491 1727203959.08634: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.08706: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.08790: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 7491 1727203959.08925: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.09072: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 7491 1727203959.09089: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 7491 1727203959.09101: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.09132: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.09174: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 7491 1727203959.09189: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.09255: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.09339: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.09510: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 
1727203959.09675: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 7491 1727203959.09693: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.09724: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.09751: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 7491 1727203959.09773: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.09803: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 7491 1727203959.09860: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.09923: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 7491 1727203959.09963: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.09978: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # 
zipimport: zlib available <<< 7491 1727203959.10021: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.10075: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 7491 1727203959.10125: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.10182: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 7491 1727203959.10195: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.10397: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.10606: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 7491 1727203959.10654: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.10708: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 7491 1727203959.10745: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.10748: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.10781: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 7491 1727203959.10808: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 7491 1727203959.10821: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.10847: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 7491 1727203959.10869: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.10900: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 7491 1727203959.10970: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.11063: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 7491 1727203959.11079: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 7491 1727203959.11116: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.11153: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 7491 1727203959.11188: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.11201: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7491 1727203959.11231: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 7491 1727203959.11284: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.11330: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.11404: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 7491 1727203959.11418: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 7491 1727203959.11462: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.11520: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 7491 1727203959.12293: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available <<< 7491 1727203959.12324: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 7491 1727203959.12419: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.12527: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 7491 1727203959.12537: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 7491 1727203959.12621: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.12835: stdout chunk (state=3): >>>import 'gc' # <<< 7491 1727203959.13659: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 7491 1727203959.13666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 7491 1727203959.13690: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 7491 1727203959.13698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 7491 1727203959.13745: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203959.13749: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24196c17f0> <<< 7491 1727203959.13755: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24196c1790> <<< 7491 1727203959.13830: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241966e910> <<< 7491 1727203959.16852: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 7491 1727203959.16882: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 7491 1727203959.16906: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24196c1580> <<< 7491 1727203959.16938: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 7491 1727203959.16985: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 7491 1727203959.17012: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419683730> <<< 7491 
1727203959.17083: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py <<< 7491 1727203959.17091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203959.17131: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py <<< 7491 1727203959.17140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' <<< 7491 1727203959.17160: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24194d5280> <<< 7491 1727203959.17179: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24194d5070> <<< 7491 1727203959.17603: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 7491 1727203959.17612: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 7491 1727203959.42598: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": 
"#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOa<<< 7491 1727203959.42651: stdout chunk (state=3): 
>>>Aucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2880, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 652, "free": 2880}, "nocache": {"free": 3321, "used": 211}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": 
"4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 305, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264287748096, "block_size": 4096, "block_total": 65519355, "block_available": 64523376, "block_used": 995979, "inode_total": 131071472, "inode_available": 130998351, "inode_used": 73121, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, 
"final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local<<< 7491 1727203959.42678: stdout chunk (state=3): >>>": {}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "39", "epoch": "1727203959", "epoch_int": "1727203959", "date": "2024-09-24", "time": "14:52:39", "iso8601_micro": "2024-09-24T18:52:39.375274Z", "iso8601": "2024-09-24T18:52:39Z", "iso8601_basic": "20240924T145239375274", "iso8601_basic_short": "20240924T145239", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_s<<< 7491 1727203959.42682: stdout chunk (state=3): >>>egmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": 
"off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.46, "5m": 0.21, "15m": 0.09}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7491 1727203959.43351: stdout chunk (state=3): >>># clear builtins._ <<< 7491 1727203959.43460: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 7491 1727203959.43483: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin <<< 7491 1727203959.43491: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix <<< 7491 1727203959.43494: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases<<< 7491 1727203959.43500: stdout chunk (state=3): >>> # cleanup[2] removing encodings # 
cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat <<< 7491 1727203959.43505: stdout chunk (state=3): >>># cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale<<< 7491 1727203959.43511: stdout chunk (state=3): >>> # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre <<< 7491 1727203959.43515: stdout chunk (state=3): >>># cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword<<< 7491 1727203959.43523: stdout chunk (state=3): >>> # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools <<< 7491 1727203959.43662: stdout chunk (state=3): >>># cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] 
removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading <<< 7491 1727203959.43871: stdout chunk (state=3): >>># cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random <<< 7491 1727203959.43876: stdout chunk (state=3): >>># destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible<<< 7491 1727203959.43879: stdout chunk (state=3): >>> # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder <<< 7491 1727203959.43881: stdout chunk (state=3): >>># cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess<<< 7491 1727203959.43884: stdout chunk (state=3): >>> # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # 
cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 7491 1727203959.43886: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid <<< 7491 1727203959.43888: stdout chunk (state=3): >>># cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging<<< 7491 1727203959.43891: stdout chunk (state=3): >>> # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat <<< 7491 1727203959.43893: stdout chunk (state=3): >>># destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text <<< 7491 1727203959.43895: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters <<< 7491 1727203959.43897: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections <<< 7491 1727203959.43899: stdout chunk (state=3): 
>>># cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 7491 1727203959.43901: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters <<< 7491 1727203959.43904: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4<<< 7491 1727203959.43912: stdout chunk (state=3): >>> # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux<<< 7491 1727203959.43914: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext <<< 7491 1727203959.43915: stdout chunk (state=3): >>># cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy 
ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace <<< 7491 1727203959.43929: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle <<< 7491 1727203959.43942: stdout chunk (state=3): >>># cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue<<< 7491 1727203959.43949: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector <<< 7491 1727203959.43951: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system <<< 7491 1727203959.43959: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing 
ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob <<< 7491 1727203959.43969: stdout chunk (state=3): >>># cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl <<< 7491 1727203959.43974: stdout chunk (state=3): >>># destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr <<< 7491 1727203959.43978: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl <<< 7491 1727203959.43985: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd <<< 7491 1727203959.44004: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # 
cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy 
ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep <<< 7491 1727203959.44029: stdout chunk (state=3): >>># cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 7491 1727203959.44458: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 7491 1727203959.44548: 
stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 7491 1727203959.44655: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux <<< 7491 1727203959.44755: stdout chunk (state=3): >>># destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction <<< 7491 1727203959.44956: stdout chunk (state=3): >>># destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct <<< 7491 1727203959.45022: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping 
ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 7491 1727203959.45036: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator <<< 7491 1727203959.45061: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] 
wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 7491 1727203959.45107: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 7491 1727203959.45123: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 7491 1727203959.45501: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 7491 1727203959.45524: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser 
# destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks <<< 7491 1727203959.45984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203959.45994: stdout chunk (state=3): >>><<< 7491 1727203959.46006: stderr chunk (state=3): >>><<< 7491 1727203959.46191: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241ab1edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aac33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241ab1eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f241ab1eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aac3490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aac3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aac3670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa7a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa7a220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa9d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa7a940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aadb880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa73d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aa9dd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241aac3970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7d2f10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7d90a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7cc5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7d36a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7d23d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a690e50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a690940> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a690f40> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a690d90> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1100> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7aedc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7a76a0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f241a7ba700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7daeb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a6a1d00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7ae2e0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a7ba310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7e0a60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1ee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1e20> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1d90> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a674400> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6744f0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a9f70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a3ac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a3490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5c2250> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a65f550> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a3f40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a7e00d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5d4b80> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5d4eb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5e57c0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5e5d00> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a57f430> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5d4fa0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a58f310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5e5640> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a58f3d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1a60> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5ab730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5aba00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5ab7f0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5ab8e0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5abd30> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241a5b5280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5ab970> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a59eac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a6a1640> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241a5abb20> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2419fe6700> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f25850> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f25160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f25280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f25fa0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f254f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f25dc0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f25580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f25100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419efa0a0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419dff370> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419dff070> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419dffcd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f0ddc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f0d3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f0df40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f5af40> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f27d60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f27430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419ed8af0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f27550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f27580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e6dfa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f6c280> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e6a820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f6c400> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f6cc40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419e6a7c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f051c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f6c9d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419f6c550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f65940> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e5f910> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e7cdc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419e69550> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e5feb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419e69970> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419ea57f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419eaa8b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24199fb940> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419ee3730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419f282e0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419e9c880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419877550> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419ead910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419ef7970> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2419ee1850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241977cc70> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24199dca30> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24199dc9a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419a28b20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419a28550> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419a102e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419a10970> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24199c12b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24199c1a00> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24199c1940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24197dd0d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2419e993a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419a10670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24196cceb0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24196cc9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2419738bb0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24196f5a60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f241973f040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241973f6d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_unhffvq2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24196c17f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24196c1790> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f241966e910> # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24196c1580> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2419683730> # 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24194d5280> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24194d5070> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_iscsi_iqn": "", "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", 
"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2880, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 652, "free": 2880}, "nocache": {"free": 3321, "used": 211}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": 
"524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 305, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264287748096, "block_size": 4096, "block_total": 65519355, "block_available": 64523376, "block_used": 995979, "inode_total": 131071472, "inode_available": 130998351, "inode_used": 73121, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "39", "epoch": "1727203959", "epoch_int": "1727203959", "date": "2024-09-24", "time": "14:52:39", "iso8601_micro": "2024-09-24T18:52:39.375274Z", "iso8601": "2024-09-24T18:52:39Z", "iso8601_basic": "20240924T145239375274", "iso8601_basic_short": "20240924T145239", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": 
"10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.46, "5m": 0.21, "15m": 0.09}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 
53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore 
sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # 
cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] 
removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy 
ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy 
ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy 
ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy 
multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # 
destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # 
destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
[WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy 
ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
7491 1727203959.47929: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203958.1440659-7548-112931182985139/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203959.47955: _low_level_execute_command(): starting 7491 1727203959.47958: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203958.1440659-7548-112931182985139/ > /dev/null 2>&1 && sleep 0' 7491 1727203959.48747: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203959.48753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203959.48802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203959.48824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203959.48830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203959.48918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203959.48941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203959.48981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203959.51353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203959.51405: stderr chunk (state=3): >>><<< 7491 1727203959.51408: stdout chunk (state=3): >>><<< 7491 1727203959.51422: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203959.51429: handler run complete 7491 
1727203959.51504: variable 'ansible_facts' from source: unknown 7491 1727203959.51572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203959.51757: variable 'ansible_facts' from source: unknown 7491 1727203959.51819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203959.51896: attempt loop complete, returning result 7491 1727203959.51900: _execute() done 7491 1727203959.51902: dumping result to json 7491 1727203959.51923: done dumping result, returning 7491 1727203959.51931: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0a4a-ad01-000000000155] 7491 1727203959.51935: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000155 7491 1727203959.52476: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000155 ok: [managed-node3] 7491 1727203959.52546: no more pending results, returning what we have 7491 1727203959.52548: results queue empty 7491 1727203959.52549: checking for any_errors_fatal 7491 1727203959.52550: done checking for any_errors_fatal 7491 1727203959.52550: checking for max_fail_percentage 7491 1727203959.52551: done checking for max_fail_percentage 7491 1727203959.52552: checking to see if all hosts have failed and the running result is not ok 7491 1727203959.52552: done checking to see if all hosts have failed 7491 1727203959.52553: getting the remaining hosts for this loop 7491 1727203959.52554: done getting the remaining hosts for this loop 7491 1727203959.52557: getting the next task for host managed-node3 7491 1727203959.52561: done getting next task for host managed-node3 7491 1727203959.52562: ^ task is: TASK: meta (flush_handlers) 7491 1727203959.52565: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203959.52567: getting variables 7491 1727203959.52568: in VariableManager get_vars() 7491 1727203959.52587: Calling all_inventory to load vars for managed-node3 7491 1727203959.52588: Calling groups_inventory to load vars for managed-node3 7491 1727203959.52591: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203959.52599: Calling all_plugins_play to load vars for managed-node3 7491 1727203959.52600: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203959.52603: Calling groups_plugins_play to load vars for managed-node3 7491 1727203959.52713: WORKER PROCESS EXITING 7491 1727203959.52727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203959.52843: done with get_vars() 7491 1727203959.52851: done getting variables 7491 1727203959.52899: in VariableManager get_vars() 7491 1727203959.52906: Calling all_inventory to load vars for managed-node3 7491 1727203959.52908: Calling groups_inventory to load vars for managed-node3 7491 1727203959.52910: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203959.52913: Calling all_plugins_play to load vars for managed-node3 7491 1727203959.52915: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203959.52919: Calling groups_plugins_play to load vars for managed-node3 7491 1727203959.53008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203959.53121: done with get_vars() 7491 1727203959.53132: done queuing things up, now waiting for results queue to drain 7491 1727203959.53133: results queue empty 7491 1727203959.53134: checking for any_errors_fatal 7491 1727203959.53135: done checking for any_errors_fatal 7491 1727203959.53136: checking for max_fail_percentage 7491 1727203959.53136: done 
checking for max_fail_percentage 7491 1727203959.53137: checking to see if all hosts have failed and the running result is not ok 7491 1727203959.53141: done checking to see if all hosts have failed 7491 1727203959.53141: getting the remaining hosts for this loop 7491 1727203959.53142: done getting the remaining hosts for this loop 7491 1727203959.53144: getting the next task for host managed-node3 7491 1727203959.53147: done getting next task for host managed-node3 7491 1727203959.53148: ^ task is: TASK: Include the task 'el_repo_setup.yml' 7491 1727203959.53149: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203959.53151: getting variables 7491 1727203959.53151: in VariableManager get_vars() 7491 1727203959.53156: Calling all_inventory to load vars for managed-node3 7491 1727203959.53157: Calling groups_inventory to load vars for managed-node3 7491 1727203959.53159: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203959.53162: Calling all_plugins_play to load vars for managed-node3 7491 1727203959.53165: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203959.53167: Calling groups_plugins_play to load vars for managed-node3 7491 1727203959.53249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203959.53355: done with get_vars() 7491 1727203959.53360: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:11 Tuesday 24 September 2024 14:52:39 -0400 (0:00:01.445) 0:00:01.458 ***** 7491 1727203959.53415: entering _queue_task() for 
managed-node3/include_tasks 7491 1727203959.53418: Creating lock for include_tasks 7491 1727203959.53631: worker is 1 (out of 1 available) 7491 1727203959.53644: exiting _queue_task() for managed-node3/include_tasks 7491 1727203959.53656: done queuing things up, now waiting for results queue to drain 7491 1727203959.53658: waiting for pending results... 7491 1727203959.53798: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 7491 1727203959.53865: in run() - task 0affcd87-79f5-0a4a-ad01-000000000006 7491 1727203959.53879: variable 'ansible_search_path' from source: unknown 7491 1727203959.53907: calling self._execute() 7491 1727203959.53961: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203959.53967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203959.53975: variable 'omit' from source: magic vars 7491 1727203959.54047: _execute() done 7491 1727203959.54050: dumping result to json 7491 1727203959.54053: done dumping result, returning 7491 1727203959.54056: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-0a4a-ad01-000000000006] 7491 1727203959.54066: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000006 7491 1727203959.54153: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000006 7491 1727203959.54156: WORKER PROCESS EXITING 7491 1727203959.54202: no more pending results, returning what we have 7491 1727203959.54206: in VariableManager get_vars() 7491 1727203959.54234: Calling all_inventory to load vars for managed-node3 7491 1727203959.54236: Calling groups_inventory to load vars for managed-node3 7491 1727203959.54239: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203959.54248: Calling all_plugins_play to load vars for managed-node3 7491 1727203959.54250: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203959.54253: Calling 
groups_plugins_play to load vars for managed-node3 7491 1727203959.54381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203959.54488: done with get_vars() 7491 1727203959.54494: variable 'ansible_search_path' from source: unknown 7491 1727203959.54505: we have included files to process 7491 1727203959.54506: generating all_blocks data 7491 1727203959.54507: done generating all_blocks data 7491 1727203959.54507: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7491 1727203959.54508: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7491 1727203959.54510: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7491 1727203959.54941: in VariableManager get_vars() 7491 1727203959.54951: done with get_vars() 7491 1727203959.54958: done processing included file 7491 1727203959.54959: iterating over new_blocks loaded from include file 7491 1727203959.54960: in VariableManager get_vars() 7491 1727203959.54967: done with get_vars() 7491 1727203959.54968: filtering new block on tags 7491 1727203959.54978: done filtering new block on tags 7491 1727203959.54980: in VariableManager get_vars() 7491 1727203959.54986: done with get_vars() 7491 1727203959.54987: filtering new block on tags 7491 1727203959.54995: done filtering new block on tags 7491 1727203959.54997: in VariableManager get_vars() 7491 1727203959.55017: done with get_vars() 7491 1727203959.55018: filtering new block on tags 7491 1727203959.55027: done filtering new block on tags 7491 1727203959.55028: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 7491 1727203959.55033: extending task 
lists for all hosts with included blocks 7491 1727203959.55066: done extending task lists 7491 1727203959.55067: done processing included files 7491 1727203959.55067: results queue empty 7491 1727203959.55068: checking for any_errors_fatal 7491 1727203959.55069: done checking for any_errors_fatal 7491 1727203959.55069: checking for max_fail_percentage 7491 1727203959.55070: done checking for max_fail_percentage 7491 1727203959.55070: checking to see if all hosts have failed and the running result is not ok 7491 1727203959.55071: done checking to see if all hosts have failed 7491 1727203959.55071: getting the remaining hosts for this loop 7491 1727203959.55072: done getting the remaining hosts for this loop 7491 1727203959.55074: getting the next task for host managed-node3 7491 1727203959.55076: done getting next task for host managed-node3 7491 1727203959.55077: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 7491 1727203959.55079: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203959.55080: getting variables 7491 1727203959.55081: in VariableManager get_vars() 7491 1727203959.55086: Calling all_inventory to load vars for managed-node3 7491 1727203959.55087: Calling groups_inventory to load vars for managed-node3 7491 1727203959.55088: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203959.55091: Calling all_plugins_play to load vars for managed-node3 7491 1727203959.55093: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203959.55094: Calling groups_plugins_play to load vars for managed-node3 7491 1727203959.55175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203959.55283: done with get_vars() 7491 1727203959.55289: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:52:39 -0400 (0:00:00.019) 0:00:01.477 ***** 7491 1727203959.55332: entering _queue_task() for managed-node3/setup 7491 1727203959.55521: worker is 1 (out of 1 available) 7491 1727203959.55534: exiting _queue_task() for managed-node3/setup 7491 1727203959.55546: done queuing things up, now waiting for results queue to drain 7491 1727203959.55547: waiting for pending results... 
7491 1727203959.55703: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 7491 1727203959.55774: in run() - task 0affcd87-79f5-0a4a-ad01-000000000166 7491 1727203959.55783: variable 'ansible_search_path' from source: unknown 7491 1727203959.55786: variable 'ansible_search_path' from source: unknown 7491 1727203959.55818: calling self._execute() 7491 1727203959.55876: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203959.55880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203959.55886: variable 'omit' from source: magic vars 7491 1727203959.56257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203959.57797: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203959.57852: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203959.57881: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203959.57909: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203959.57930: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203959.57989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203959.58013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203959.58033: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203959.58061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203959.58075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203959.58200: variable 'ansible_facts' from source: unknown 7491 1727203959.58248: variable 'network_test_required_facts' from source: task vars 7491 1727203959.58278: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 7491 1727203959.58284: variable 'omit' from source: magic vars 7491 1727203959.58312: variable 'omit' from source: magic vars 7491 1727203959.58340: variable 'omit' from source: magic vars 7491 1727203959.58359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203959.58383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203959.58400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203959.58412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203959.58423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203959.58446: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203959.58449: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 
1727203959.58451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203959.58523: Set connection var ansible_timeout to 10 7491 1727203959.58529: Set connection var ansible_pipelining to False 7491 1727203959.58535: Set connection var ansible_shell_type to sh 7491 1727203959.58540: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203959.58550: Set connection var ansible_shell_executable to /bin/sh 7491 1727203959.58552: Set connection var ansible_connection to ssh 7491 1727203959.58571: variable 'ansible_shell_executable' from source: unknown 7491 1727203959.58573: variable 'ansible_connection' from source: unknown 7491 1727203959.58576: variable 'ansible_module_compression' from source: unknown 7491 1727203959.58579: variable 'ansible_shell_type' from source: unknown 7491 1727203959.58581: variable 'ansible_shell_executable' from source: unknown 7491 1727203959.58585: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203959.58587: variable 'ansible_pipelining' from source: unknown 7491 1727203959.58589: variable 'ansible_timeout' from source: unknown 7491 1727203959.58593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203959.58692: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203959.58701: variable 'omit' from source: magic vars 7491 1727203959.58704: starting attempt loop 7491 1727203959.58707: running the handler 7491 1727203959.58723: _low_level_execute_command(): starting 7491 1727203959.58729: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203959.59252: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203959.59269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203959.59295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203959.59309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203959.59320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203959.59375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203959.59386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203959.59443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203959.61581: stdout chunk (state=3): >>>/root <<< 7491 1727203959.61723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203959.61790: stderr chunk (state=3): >>><<< 7491 1727203959.61793: stdout chunk (state=3): >>><<< 7491 1727203959.61814: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203959.61831: _low_level_execute_command(): starting 7491 1727203959.61834: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203959.6181378-7647-66951561288669 `" && echo ansible-tmp-1727203959.6181378-7647-66951561288669="` echo /root/.ansible/tmp/ansible-tmp-1727203959.6181378-7647-66951561288669 `" ) && sleep 0' 7491 1727203959.62306: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203959.62316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203959.62351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203959.62363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203959.62422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203959.62442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203959.62490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203959.65101: stdout chunk (state=3): >>>ansible-tmp-1727203959.6181378-7647-66951561288669=/root/.ansible/tmp/ansible-tmp-1727203959.6181378-7647-66951561288669 <<< 7491 1727203959.65270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203959.65335: stderr chunk (state=3): >>><<< 7491 1727203959.65341: stdout chunk (state=3): >>><<< 7491 1727203959.65359: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203959.6181378-7647-66951561288669=/root/.ansible/tmp/ansible-tmp-1727203959.6181378-7647-66951561288669 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203959.65407: variable 'ansible_module_compression' from source: unknown 7491 1727203959.65451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7491 1727203959.65503: variable 'ansible_facts' from source: unknown 7491 1727203959.65622: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203959.6181378-7647-66951561288669/AnsiballZ_setup.py 7491 1727203959.65872: Sending initial data 7491 1727203959.65875: Sent initial data (151 bytes) 7491 1727203959.66798: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203959.66809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203959.66821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203959.66835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203959.66875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203959.66882: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203959.66892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203959.66906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203959.66914: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.15.87 is address <<< 7491 1727203959.66922: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203959.66927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203959.66936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203959.66947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203959.66955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203959.66961: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203959.66972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203959.67047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203959.67095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203959.67098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203959.67414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203959.69763: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203959.69802: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 
7491 1727203959.69842: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpj6cw3vpy /root/.ansible/tmp/ansible-tmp-1727203959.6181378-7647-66951561288669/AnsiballZ_setup.py <<< 7491 1727203959.69883: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203959.73087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203959.73186: stderr chunk (state=3): >>><<< 7491 1727203959.73190: stdout chunk (state=3): >>><<< 7491 1727203959.73275: done transferring module to remote 7491 1727203959.73278: _low_level_execute_command(): starting 7491 1727203959.73280: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203959.6181378-7647-66951561288669/ /root/.ansible/tmp/ansible-tmp-1727203959.6181378-7647-66951561288669/AnsiballZ_setup.py && sleep 0' 7491 1727203959.73956: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203959.73967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203959.73981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203959.74031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203959.74035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203959.74037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203959.74091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203959.74094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203959.74141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7491 1727203959.76477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203959.76558: stderr chunk (state=3): >>><<< 7491 1727203959.76562: stdout chunk (state=3): >>><<< 7491 1727203959.76669: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7491 1727203959.76673: _low_level_execute_command(): starting 7491 1727203959.76677: 
_low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203959.6181378-7647-66951561288669/AnsiballZ_setup.py && sleep 0' 7491 1727203959.77347: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203959.77371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203959.77387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203959.77405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203959.77458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203959.77477: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203959.77492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203959.77510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203959.77525: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203959.77535: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203959.77544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203959.77554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203959.77579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203959.77590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203959.77599: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203959.77610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203959.77694: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203959.77712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203959.77728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203959.77809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203959.79684: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # <<< 7491 1727203959.79690: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 7491 1727203959.79738: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 7491 1727203959.79773: stdout chunk (state=3): >>>import 'posix' # <<< 7491 1727203959.79809: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 7491 1727203959.79821: stdout chunk (state=3): >>># installing zipimport hook <<< 7491 1727203959.79847: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 7491 1727203959.79894: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203959.79930: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 7491 1727203959.79951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 7491 1727203959.79966: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08a43dc0> <<< 7491 1727203959.80005: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 
7491 1727203959.80028: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087d83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08a43b20> <<< 7491 1727203959.80044: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 7491 1727203959.80072: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08a43ac0> import '_signal' # <<< 7491 1727203959.80101: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087d8490> <<< 7491 1727203959.80141: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 7491 1727203959.80173: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # <<< 7491 1727203959.80192: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087d8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087d8670> <<< 7491 1727203959.80228: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 7491 1727203959.80261: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 7491 1727203959.80283: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 7491 1727203959.80308: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 7491 1727203959.80322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 7491 1727203959.80342: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0878f190> <<< 7491 1727203959.80369: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 7491 1727203959.80372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 7491 1727203959.80440: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0878f220> <<< 7491 1727203959.80470: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 7491 1727203959.80500: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087b2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0878f940> <<< 7491 1727203959.80549: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087f0880> <<< 7491 1727203959.80567: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 7491 1727203959.80570: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08788d90> <<< 7491 1727203959.80627: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # <<< 7491 1727203959.80631: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087b2d90> <<< 7491 1727203959.80675: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087d8970> <<< 7491 1727203959.80698: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 7491 1727203959.81031: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 7491 1727203959.81063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 7491 1727203959.81078: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 7491 1727203959.81096: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 7491 1727203959.81121: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 7491 1727203959.81148: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0872ef10> <<< 7491 1727203959.81187: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087340a0> <<< 7491 1727203959.81210: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 7491 1727203959.81217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 7491 1727203959.81230: stdout chunk (state=3): >>>import '_sre' # <<< 7491 1727203959.81244: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 7491 1727203959.81288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 7491 1727203959.81291: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches 
/usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 7491 1727203959.81309: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087275b0> <<< 7491 1727203959.81324: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0872f6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0872e3d0> <<< 7491 1727203959.81349: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 7491 1727203959.81414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 7491 1727203959.81435: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 7491 1727203959.81459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203959.81484: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 7491 1727203959.81527: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08615eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f086159a0> <<< 7491 1727203959.81559: stdout chunk (state=3): >>>import 'itertools' # <<< 7491 1727203959.81571: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc 
matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08615fa0> <<< 7491 1727203959.81595: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 7491 1727203959.81635: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08615df0> <<< 7491 1727203959.81658: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 7491 1727203959.81661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08625160> import '_collections' # <<< 7491 1727203959.81708: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08709e20> <<< 7491 1727203959.81711: stdout chunk (state=3): >>>import '_functools' # <<< 7491 1727203959.81732: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08701700> <<< 7491 1727203959.81792: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08715760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08735eb0> <<< 7491 1727203959.81813: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from 
'/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 7491 1727203959.81841: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08625d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08709340> <<< 7491 1727203959.81901: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203959.81905: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08715370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0873ba60> <<< 7491 1727203959.81933: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 7491 1727203959.81954: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203959.82000: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08625f40> <<< 7491 1727203959.82028: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9f08625e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08625df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 7491 1727203959.82060: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 7491 1727203959.82063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 7491 1727203959.82091: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 7491 1727203959.82133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 7491 1727203959.82172: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085f9460> <<< 7491 1727203959.82189: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 7491 1727203959.82217: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085f9550> <<< 7491 1727203959.82334: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085d70d0> <<< 7491 
1727203959.82395: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08628b20> <<< 7491 1727203959.82401: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f086284c0> <<< 7491 1727203959.82419: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 7491 1727203959.82451: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 7491 1727203959.82486: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 7491 1727203959.82498: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0852d2b0> <<< 7491 1727203959.82522: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085e4d60> <<< 7491 1727203959.82568: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08628fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0873b0d0> <<< 7491 1727203959.82593: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 7491 1727203959.82611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 7491 1727203959.82654: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from 
'/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0853dbe0> import 'errno' # <<< 7491 1727203959.82697: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f0853df10> <<< 7491 1727203959.82725: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 7491 1727203959.82736: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08550820> <<< 7491 1727203959.82768: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 7491 1727203959.82789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 7491 1727203959.82813: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08550d60> <<< 7491 1727203959.82895: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f084e9490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0853df40> # 
/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 7491 1727203959.82955: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f084f9370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085506a0> import 'pwd' # <<< 7491 1727203959.83060: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f084f9430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08625ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 7491 1727203959.83074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 7491 1727203959.83779: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08515790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from 
'/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08515a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08515850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08515940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08515d90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f085202e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085159d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08509b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f086256a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # 
code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08515b80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9f0843e760> <<< 7491 1727203959.84069: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available <<< 7491 1727203959.84108: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 7491 1727203959.84124: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.85311: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.86237: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837b8b0> <<< 7491 1727203959.86297: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 7491 1727203959.86336: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # 
code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f0837b160> <<< 7491 1727203959.86350: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837b280> <<< 7491 1727203959.86405: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837b5e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 7491 1727203959.86446: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837b4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837be20> import 'atexit' # <<< 7491 1727203959.86492: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f0837b580> <<< 7491 1727203959.86525: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 7491 1727203959.86579: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837b100> <<< 7491 1727203959.86583: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches 
/usr/lib64/python3.9/platform.py <<< 7491 1727203959.86627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 7491 1727203959.86637: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 7491 1727203959.86659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 7491 1727203959.86743: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07dc0040> <<< 7491 1727203959.86770: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d093d0> <<< 7491 1727203959.86800: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203959.86826: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d090d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 7491 1727203959.86873: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07d09d30> import 'subprocess' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9f08363d90> <<< 7491 1727203959.87045: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083633a0> <<< 7491 1727203959.87088: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 7491 1727203959.87104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08363f40> <<< 7491 1727203959.87119: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 7491 1727203959.87149: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 7491 1727203959.87180: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 7491 1727203959.87202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0843ea90> <<< 7491 1727203959.87257: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07deedc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07dee490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08378a90> <<< 7491 1727203959.87295: stdout chunk (state=3): >>># extension module 
'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07dee5b0> <<< 7491 1727203959.87343: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07dee5e0> <<< 7491 1727203959.87368: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 7491 1727203959.87380: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 7491 1727203959.87403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 7491 1727203959.87510: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d74f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083c52e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 7491 1727203959.87553: stdout chunk (state=3): >>># extension module '_uuid' 
loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d717f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083c5460> <<< 7491 1727203959.87578: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 7491 1727203959.87646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 7491 1727203959.87691: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083c5c40> <<< 7491 1727203959.87965: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07d71790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f083c5130> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f083c5670> <<< 7491 1727203959.87992: stdout chunk (state=3): >>># extension module 'systemd.id128' 
loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f083c5730> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083bd9a0> <<< 7491 1727203959.88037: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 7491 1727203959.88040: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 7491 1727203959.88083: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203959.88087: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d678e0> <<< 7491 1727203959.88257: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d85c70> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07d70520> <<< 7491 1727203959.88318: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d67e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07d70940> <<< 7491 1727203959.88322: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7491 1727203959.88336: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 7491 1727203959.88404: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.88518: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 7491 1727203959.88532: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 7491 1727203959.88631: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.88718: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.89156: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.89619: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 7491 1727203959.89623: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 
'ansible.module_utils.six.moves.collections_abc' # <<< 7491 1727203959.89626: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 7491 1727203959.89647: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203959.89703: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203959.89706: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d80790> <<< 7491 1727203959.89768: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07dbf850> <<< 7491 1727203959.89784: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07945fa0> <<< 7491 1727203959.89835: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 7491 1727203959.89838: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.89868: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.89871: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 7491 1727203959.89986: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.90112: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 7491 1727203959.90140: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07df3310> # zipimport: zlib available <<< 7491 1727203959.90530: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.90900: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.90958: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.91030: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 7491 1727203959.91075: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.91101: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 7491 1727203959.91153: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.91239: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 7491 1727203959.91243: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 7491 1727203959.91266: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.91278: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.91314: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 7491 1727203959.91495: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.91680: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 7491 1727203959.91714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 7491 1727203959.91786: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08381ca0> # zipimport: zlib available <<< 7491 1727203959.91851: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.91921: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 7491 1727203959.91939: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.91980: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.92020: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # 
loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 7491 1727203959.92027: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.92045: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.92094: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.92181: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.92238: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 7491 1727203959.92258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203959.92327: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07da3c70> <<< 7491 1727203959.92414: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08381bb0> <<< 7491 1727203959.92444: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 7491 1727203959.92502: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.92546: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.92570: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 7491 1727203959.92617: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 7491 1727203959.92641: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 7491 1727203959.92670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 7491 1727203959.92691: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 7491 1727203959.92709: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 7491 1727203959.92788: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07d832b0> <<< 7491 1727203959.92829: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0834fb80> <<< 7491 1727203959.92882: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f077a0160> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 7491 1727203959.92905: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.92937: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 7491 1727203959.93017: stdout chunk 
(state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 7491 1727203959.93049: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 7491 1727203959.93090: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93156: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93182: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93195: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93220: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93249: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93277: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93312: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 7491 1727203959.93386: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93441: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93461: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93502: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 7491 1727203959.93636: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93782: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.93807: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 
1727203959.93854: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203959.93897: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 7491 1727203959.93914: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 7491 1727203959.93958: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f076a2100> <<< 7491 1727203959.93985: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 7491 1727203959.94007: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 7491 1727203959.94051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07908a60> <<< 7491 1727203959.94091: stdout chunk (state=3): >>># extension module '_pickle' loaded from 
'/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f079089d0> <<< 7491 1727203959.94166: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f078dac70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f078dac10> <<< 7491 1727203959.94193: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07924460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f079243d0> <<< 7491 1727203959.94225: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 7491 1727203959.94253: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 7491 1727203959.94291: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 7491 1727203959.94303: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f078ea310> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f078ea9a0> <<< 7491 1727203959.94332: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 7491 1727203959.94351: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f078ea940> <<< 7491 1727203959.94387: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 7491 1727203959.94409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 7491 1727203959.94425: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f077040d0> <<< 7491 1727203959.94473: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083cdc40> <<< 7491 1727203959.94517: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07924790> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 7491 1727203959.94530: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.94581: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 7491 1727203959.94636: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 7491 1727203959.94640: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.94669: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.94750: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available <<< 7491 1727203959.94753: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 7491 1727203959.94780: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.94814: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 7491 1727203959.94851: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.94893: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 7491 1727203959.94940: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.94980: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 7491 1727203959.95029: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 7491 1727203959.95083: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.95124: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.95198: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 7491 1727203959.95748: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.96524: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available<<< 7491 1727203959.96567: stdout chunk (state=3): >>> # zipimport: zlib available <<< 7491 1727203959.96614: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 7491 1727203959.96643: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.96727: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.96812: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 7491 1727203959.96843: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.96895: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203959.96901: stdout chunk (state=3): >>> <<< 7491 1727203959.96940: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 7491 1727203959.96972: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203959.96977: stdout chunk (state=3): >>> <<< 7491 1727203959.97022: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.97073: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py<<< 7491 1727203959.97079: stdout chunk (state=3): >>> <<< 7491 1727203959.97110: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203959.97119: stdout chunk (state=3): >>> <<< 7491 1727203959.97220: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203959.97227: stdout chunk (state=3): >>> <<< 7491 1727203959.97337: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py<<< 7491 1727203959.97342: stdout chunk (state=3): >>> <<< 7491 1727203959.97367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc'<<< 7491 1727203959.97371: stdout chunk (state=3): >>> <<< 7491 1727203959.97411: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f075f4f10><<< 7491 1727203959.97419: stdout chunk (state=3): >>> <<< 7491 1727203959.97462: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py<<< 7491 1727203959.97468: stdout chunk (state=3): >>> <<< 7491 1727203959.97507: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc'<<< 7491 1727203959.97512: stdout chunk (state=3): >>> <<< 7491 1727203959.97773: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f075f49d0><<< 7491 1727203959.97779: stdout chunk (state=3): >>> <<< 7491 1727203959.97797: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py<<< 7491 1727203959.97802: stdout chunk (state=3): >>> <<< 7491 1727203959.97830: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203959.97839: stdout chunk (state=3): >>> <<< 7491 1727203959.97922: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.97970: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 7491 1727203959.98005: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.98080: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.98298: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 7491 1727203959.98384: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.98482: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available <<< 7491 1727203959.98572: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 7491 1727203959.98753: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07616c10> <<< 7491 1727203959.98919: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07666c40> <<< 7491 1727203959.98931: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 7491 1727203959.98939: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99014: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99058: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 7491 1727203959.99081: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99138: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99213: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99311: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99450: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 7491 1727203959.99453: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99493: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99531: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 7491 1727203959.99537: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99571: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99623: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 7491 1727203959.99677: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f076685e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07668790> <<< 7491 1727203959.99705: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 7491 1727203959.99708: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7491 1727203959.99712: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 7491 1727203959.99731: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99754: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203959.99808: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 7491 1727203959.99934: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.00510: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 7491 1727203960.00629: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7491 1727203960.00745: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.00939: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 7491 1727203960.00942: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 7491 1727203960.00948: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.01113: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.01352: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 7491 1727203960.01356: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.01401: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.01444: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.02179: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.02728: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 7491 1727203960.02736: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.02829: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.02930: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 7491 1727203960.02933: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.03009: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.03102: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 7491 1727203960.03237: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 
1727203960.03772: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 7491 1727203960.04045: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.04320: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 7491 1727203960.04329: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.04380: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.04448: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 7491 1727203960.04451: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.04466: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.04503: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 7491 1727203960.04506: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.04598: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.04694: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 7491 1727203960.04700: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.04730: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.04752: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 7491 1727203960.04778: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.04839: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.05385: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 7491 1727203960.05448: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.05452: stdout chunk (state=3): >>> <<< 7491 1727203960.05835: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py<<< 7491 1727203960.05849: stdout chunk (state=3): >>> <<< 7491 1727203960.05866: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 7491 1727203960.05941: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.05943: stdout chunk (state=3): >>> <<< 7491 1727203960.06028: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py<<< 7491 1727203960.06030: stdout chunk (state=3): >>> <<< 7491 1727203960.06056: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.06058: stdout chunk (state=3): >>> <<< 7491 1727203960.06100: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.06136: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py<<< 7491 1727203960.06139: stdout chunk (state=3): >>> <<< 7491 1727203960.06157: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.06209: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.06211: stdout chunk (state=3): >>> <<< 7491 1727203960.06251: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py<<< 7491 1727203960.06253: stdout chunk (state=3): >>> <<< 7491 1727203960.06279: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.06281: stdout chunk (state=3): >>> <<< 7491 1727203960.06324: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.06376: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py<<< 7491 1727203960.06382: stdout chunk (state=3): >>> <<< 7491 1727203960.06400: stdout chunk 
(state=3): >>># zipimport: zlib available<<< 7491 1727203960.06402: stdout chunk (state=3): >>> <<< 7491 1727203960.06501: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.06506: stdout chunk (state=3): >>> <<< 7491 1727203960.06607: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py<<< 7491 1727203960.06612: stdout chunk (state=3): >>> <<< 7491 1727203960.06635: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.06659: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.06675: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py<<< 7491 1727203960.06680: stdout chunk (state=3): >>> <<< 7491 1727203960.06703: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.06708: stdout chunk (state=3): >>> <<< 7491 1727203960.06761: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.06821: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py<<< 7491 1727203960.06827: stdout chunk (state=3): >>> <<< 7491 1727203960.06843: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.06870: stdout chunk (state=3): >>> <<< 7491 1727203960.06907: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.06920: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.06980: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.07004: stdout chunk (state=3): >>> <<< 7491 1727203960.07046: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.07051: stdout 
chunk (state=3): >>> <<< 7491 1727203960.07151: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.07252: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py<<< 7491 1727203960.07274: stdout chunk (state=3): >>> import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 7491 1727203960.07291: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 7491 1727203960.07309: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.07367: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.07382: stdout chunk (state=3): >>> <<< 7491 1727203960.07443: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 7491 1727203960.07465: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.07742: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.07996: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 7491 1727203960.08018: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.08021: stdout chunk (state=3): >>> <<< 7491 1727203960.08092: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.08096: stdout chunk (state=3): >>> <<< 7491 1727203960.08185: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py<<< 7491 1727203960.08188: stdout chunk (state=3): >>> <<< 7491 1727203960.08191: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.08235: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.08330: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 7491 1727203960.08333: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.08336: stdout chunk (state=3): >>> <<< 7491 1727203960.08442: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.08540: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 7491 1727203960.08544: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 7491 1727203960.08575: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.08585: stdout chunk (state=3): >>> <<< 7491 1727203960.08683: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.08688: stdout chunk (state=3): >>> <<< 7491 1727203960.08798: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py<<< 7491 1727203960.08814: stdout chunk (state=3): >>> import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py <<< 7491 1727203960.08827: stdout chunk (state=3): >>>import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py<<< 7491 1727203960.08833: stdout chunk (state=3): >>> <<< 7491 1727203960.08933: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.08942: stdout chunk (state=3): >>> <<< 7491 1727203960.09876: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 7491 1727203960.09891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 7491 1727203960.09916: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 7491 1727203960.09934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 7491 1727203960.09973: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203960.09983: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07408d90> <<< 7491 1727203960.09990: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f073da070> <<< 7491 1727203960.10073: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f073da1c0> <<< 7491 1727203960.10694: stdout chunk (state=3): >>>import 'gc' # <<< 7491 1727203960.11247: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": 
{"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "40", "epoch": "1727203960", "epoch_int": "1727203960", "date": "2024-09-24", "time": "14:52:40", "iso8601_micro": "2024-09-24T18:52:40.091161Z", "iso8601": "2024-09-24T18:52:40Z", "iso8601_basic": "20240924T145240091161", "iso8601_basic_short": "20240924T145240", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_pa<<< 7491 1727203960.11261: stdout chunk (state=3): >>>rsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", 
"ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7491 1727203960.11921: stdout chunk (state=3): >>># clear builtins._ # clear sys.path <<< 7491 
1727203960.12005: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ <<< 7491 1727203960.12053: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc <<< 7491 1727203960.12099: stdout chunk (state=3): >>># cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq <<< 7491 1727203960.12105: stdout chunk (state=3): >>># cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] 
removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct <<< 7491 1727203960.12111: stdout chunk (state=3): >>># cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading <<< 7491 1727203960.12121: stdout chunk (state=3): >>># cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile <<< 7491 1727203960.12124: stdout chunk (state=3): >>># cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy 
ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl <<< 7491 1727203960.12126: stdout chunk (state=3): >>># cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string <<< 7491 1727203960.12129: stdout chunk (state=3): >>># cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common <<< 7491 1727203960.12131: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text <<< 7491 1727203960.12132: stdout chunk (state=3): >>># cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool <<< 7491 1727203960.12133: stdout chunk (state=3): >>># destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # 
cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro <<< 7491 1727203960.12137: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 7491 1727203960.12139: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system <<< 7491 1727203960.12140: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing 
ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl <<< 7491 1727203960.12141: stdout chunk (state=3): >>># destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd <<< 7491 1727203960.12142: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual <<< 7491 1727203960.12144: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # 
cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other <<< 7491 1727203960.12145: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb <<< 7491 1727203960.12147: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy 
ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc <<< 7491 1727203960.12485: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 7491 
1727203960.12495: stdout chunk (state=3): >>> <<< 7491 1727203960.12512: stdout chunk (state=3): >>># destroy importlib.util <<< 7491 1727203960.12519: stdout chunk (state=3): >>># destroy importlib.abc # destroy importlib.machinery <<< 7491 1727203960.12554: stdout chunk (state=3): >>># destroy zipimport <<< 7491 1727203960.12560: stdout chunk (state=3): >>># destroy _compression <<< 7491 1727203960.12587: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 7491 1727203960.12621: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 7491 1727203960.12639: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 7491 1727203960.12643: stdout chunk (state=3): >>># destroy encodings <<< 7491 1727203960.12670: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 7491 1727203960.12723: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 7491 1727203960.12778: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors <<< 7491 1727203960.12791: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array <<< 7491 1727203960.12803: stdout chunk (state=3): >>># destroy _compat_pickle <<< 7491 1727203960.12824: stdout chunk (state=3): >>># destroy queue <<< 7491 1727203960.12829: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 7491 1727203960.12860: stdout chunk (state=3): >>># destroy shlex <<< 7491 1727203960.12874: stdout chunk (state=3): >>># destroy datetime <<< 7491 1727203960.12885: stdout chunk (state=3): >>># destroy 
base64 <<< 7491 1727203960.12906: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 7491 1727203960.12921: stdout chunk (state=3): >>># destroy getpass # destroy json <<< 7491 1727203960.12951: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 7491 1727203960.12958: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 7491 1727203960.13030: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna <<< 7491 1727203960.13080: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 7491 1727203960.13108: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 7491 1727203960.13112: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 7491 1727203960.13118: stdout chunk (state=3): >>># destroy linecache # cleanup[3] wiping tokenize <<< 7491 1727203960.13124: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess <<< 7491 1727203960.13133: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal <<< 7491 1727203960.13146: stdout chunk (state=3): >>># cleanup[3] wiping fcntl # cleanup[3] 
wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 7491 1727203960.13149: stdout chunk (state=3): >>># destroy fnmatch <<< 7491 1727203960.13156: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 7491 1727203960.13160: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external <<< 7491 1727203960.13217: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 7491 1727203960.13224: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 7491 1727203960.13265: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping 
_frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 7491 1727203960.13323: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 7491 1727203960.13332: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 7491 1727203960.13545: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 7491 1727203960.13588: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat <<< 7491 1727203960.13623: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 7491 1727203960.13652: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 7491 1727203960.13699: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 7491 1727203960.14223: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203960.14232: stdout chunk (state=3): >>><<< 7491 1727203960.14248: stderr chunk (state=3): >>><<< 7491 1727203960.14379: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08a43dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087d83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08a43b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08a43ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9f087d8490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087d8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087d8670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0878f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0878f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087b2850> import 'posixpath' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9f0878f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087f0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08788d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087b2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087d8970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0872ef10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087340a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from 
'/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f087275b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0872f6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0872e3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08615eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f086159a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08615fa0> # 
/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08615df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08625160> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08709e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08701700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08715760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08735eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08625d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08709340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08715370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0873ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08625f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08625e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08625df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085f9460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085f9550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085d70d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08628b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f086284c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0852d2b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085e4d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08628fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0873b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0853dbe0> import 'errno' # # extension 
module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f0853df10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08550820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08550d60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f084e9490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0853df40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f084f9370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085506a0> import 'pwd' # # 
extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f084f9430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08625ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08515790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08515a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08515850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08515940> # 
/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f08515d90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f085202e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f085159d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08509b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f086256a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08515b80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9f0843e760> # zipimport: found 103 names in '/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/__init__.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837b8b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f0837b160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837b280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837b5e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837b4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837be20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f0837b580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0837b100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07dc0040> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d093d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d090d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9f07d09d30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08363d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083633a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08363f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0843ea90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07deedc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07dee490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08378a90> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07dee5b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code 
object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07dee5e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d74f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083c52e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d717f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083c5460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 
'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083c5c40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07d71790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f083c5130> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f083c5670> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f083c5730> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083bd9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d678e0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d85c70> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07d70520> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d67e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07d70940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py 
import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07d80790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07dbf850> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07945fa0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07df3310> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08381ca0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07da3c70> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f08381bb0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07d832b0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f0834fb80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f077a0160> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f076a2100> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07908a60> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f9f079089d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f078dac70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f078dac10> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07924460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f079243d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f078ea310> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f078ea9a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f078ea940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from 
'/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f077040d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f083cdc40> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07924790> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f075f4f10> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f075f49d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object 
from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07616c10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07666c40> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f076685e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f07668790> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip 
/tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_wondcik_/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9f07408d90> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f073da070> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9f073da1c0> import 'gc' # {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", 
"which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "40", "epoch": "1727203960", "epoch_int": "1727203960", "date": "2024-09-24", "time": "14:52:40", "iso8601_micro": "2024-09-24T18:52:40.091161Z", "iso8601": "2024-09-24T18:52:40Z", "iso8601_basic": "20240924T145240091161", "iso8601_basic_short": "20240924T145240", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # 
cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # 
destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 
# cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # 
cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping 
encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon 
# destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] 
wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # 
cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 7491 1727203960.15427: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203959.6181378-7647-66951561288669/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': 
None}) 7491 1727203960.15431: _low_level_execute_command(): starting 7491 1727203960.15433: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203959.6181378-7647-66951561288669/ > /dev/null 2>&1 && sleep 0' 7491 1727203960.16058: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203960.16067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203960.16079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203960.16092: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203960.16095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.16123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727203960.16126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203960.16129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.16174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203960.16185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203960.16243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203960.18708: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 7491 1727203960.18767: stderr chunk (state=3): >>><<< 7491 1727203960.18771: stdout chunk (state=3): >>><<< 7491 1727203960.18786: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203960.18796: handler run complete 7491 1727203960.18843: variable 'ansible_facts' from source: unknown 7491 1727203960.18885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203960.18958: variable 'ansible_facts' from source: unknown 7491 1727203960.18992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203960.19033: attempt loop complete, returning result 7491 1727203960.19036: _execute() done 7491 1727203960.19038: dumping result to json 7491 1727203960.19044: done dumping result, 
returning 7491 1727203960.19052: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-0a4a-ad01-000000000166] 7491 1727203960.19056: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000166 ok: [managed-node3] 7491 1727203960.19304: no more pending results, returning what we have 7491 1727203960.19307: results queue empty 7491 1727203960.19308: checking for any_errors_fatal 7491 1727203960.19310: done checking for any_errors_fatal 7491 1727203960.19310: checking for max_fail_percentage 7491 1727203960.19312: done checking for max_fail_percentage 7491 1727203960.19313: checking to see if all hosts have failed and the running result is not ok 7491 1727203960.19313: done checking to see if all hosts have failed 7491 1727203960.19314: getting the remaining hosts for this loop 7491 1727203960.19315: done getting the remaining hosts for this loop 7491 1727203960.19321: getting the next task for host managed-node3 7491 1727203960.19329: done getting next task for host managed-node3 7491 1727203960.19331: ^ task is: TASK: Check if system is ostree 7491 1727203960.19334: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203960.19337: getting variables 7491 1727203960.19338: in VariableManager get_vars() 7491 1727203960.19389: Calling all_inventory to load vars for managed-node3 7491 1727203960.19391: Calling groups_inventory to load vars for managed-node3 7491 1727203960.19394: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203960.19403: Calling all_plugins_play to load vars for managed-node3 7491 1727203960.19405: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203960.19408: Calling groups_plugins_play to load vars for managed-node3 7491 1727203960.19509: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000166 7491 1727203960.19514: WORKER PROCESS EXITING 7491 1727203960.19525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203960.19641: done with get_vars() 7491 1727203960.19649: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:52:40 -0400 (0:00:00.643) 0:00:02.121 ***** 7491 1727203960.19720: entering _queue_task() for managed-node3/stat 7491 1727203960.19908: worker is 1 (out of 1 available) 7491 1727203960.19923: exiting _queue_task() for managed-node3/stat 7491 1727203960.19934: done queuing things up, now waiting for results queue to drain 7491 1727203960.19935: waiting for pending results... 
7491 1727203960.20093: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 7491 1727203960.20166: in run() - task 0affcd87-79f5-0a4a-ad01-000000000168 7491 1727203960.20177: variable 'ansible_search_path' from source: unknown 7491 1727203960.20181: variable 'ansible_search_path' from source: unknown 7491 1727203960.20209: calling self._execute() 7491 1727203960.20275: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203960.20278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203960.20285: variable 'omit' from source: magic vars 7491 1727203960.20625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203960.20824: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203960.20858: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203960.20885: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203960.20914: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203960.20980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203960.20997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203960.21019: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203960.21041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203960.21134: Evaluated conditional (not __network_is_ostree is defined): True 7491 1727203960.21138: variable 'omit' from source: magic vars 7491 1727203960.21168: variable 'omit' from source: magic vars 7491 1727203960.21195: variable 'omit' from source: magic vars 7491 1727203960.21214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203960.21240: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203960.21257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203960.21270: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203960.21278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203960.21300: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203960.21303: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203960.21306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203960.21378: Set connection var ansible_timeout to 10 7491 1727203960.21383: Set connection var ansible_pipelining to False 7491 1727203960.21388: Set connection var ansible_shell_type to sh 7491 1727203960.21393: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203960.21400: Set connection var ansible_shell_executable to /bin/sh 7491 1727203960.21404: Set connection var ansible_connection to ssh 7491 1727203960.21425: variable 'ansible_shell_executable' from source: unknown 7491 1727203960.21427: variable 'ansible_connection' from source: unknown 7491 1727203960.21430: variable 'ansible_module_compression' 
from source: unknown 7491 1727203960.21433: variable 'ansible_shell_type' from source: unknown 7491 1727203960.21437: variable 'ansible_shell_executable' from source: unknown 7491 1727203960.21439: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203960.21441: variable 'ansible_pipelining' from source: unknown 7491 1727203960.21443: variable 'ansible_timeout' from source: unknown 7491 1727203960.21447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203960.21544: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203960.21552: variable 'omit' from source: magic vars 7491 1727203960.21565: starting attempt loop 7491 1727203960.21571: running the handler 7491 1727203960.21574: _low_level_execute_command(): starting 7491 1727203960.21581: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203960.22121: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203960.22125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203960.22149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.22152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203960.22155: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.22201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203960.22213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203960.22276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203960.24462: stdout chunk (state=3): >>>/root <<< 7491 1727203960.24771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203960.24775: stdout chunk (state=3): >>><<< 7491 1727203960.24778: stderr chunk (state=3): >>><<< 7491 1727203960.24782: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203960.24792: _low_level_execute_command(): starting 7491 1727203960.24796: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203960.2472603-7676-280148452040430 `" && echo ansible-tmp-1727203960.2472603-7676-280148452040430="` echo /root/.ansible/tmp/ansible-tmp-1727203960.2472603-7676-280148452040430 `" ) && sleep 0' 7491 1727203960.25420: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203960.25424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203960.25439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203960.25447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203960.25499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203960.25505: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203960.25519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.25528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203960.25536: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203960.25543: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203960.25550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203960.25559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203960.25573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203960.25576: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203960.25593: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203960.25610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.25688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203960.25691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203960.25705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203960.25785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 7491 1727203960.28338: stdout chunk (state=3): >>>ansible-tmp-1727203960.2472603-7676-280148452040430=/root/.ansible/tmp/ansible-tmp-1727203960.2472603-7676-280148452040430 <<< 7491 1727203960.28489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203960.28493: stderr chunk (state=3): >>><<< 7491 1727203960.28495: stdout chunk (state=3): >>><<< 7491 1727203960.28522: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203960.2472603-7676-280148452040430=/root/.ansible/tmp/ansible-tmp-1727203960.2472603-7676-280148452040430 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 7491 1727203960.28580: variable 'ansible_module_compression' from source: unknown 7491 1727203960.28641: ANSIBALLZ: Using lock for stat 7491 1727203960.28644: ANSIBALLZ: Acquiring lock 7491 1727203960.28646: ANSIBALLZ: Lock acquired: 139674606107008 7491 1727203960.28649: ANSIBALLZ: Creating module 7491 1727203960.57896: ANSIBALLZ: Writing module into payload 7491 1727203960.58028: ANSIBALLZ: Writing module 7491 1727203960.58125: ANSIBALLZ: Renaming module 7491 1727203960.58136: ANSIBALLZ: Done creating module 7491 1727203960.58159: variable 'ansible_facts' from source: unknown 7491 1727203960.58281: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203960.2472603-7676-280148452040430/AnsiballZ_stat.py 7491 1727203960.59160: Sending initial data 7491 1727203960.59177: Sent initial data (151 bytes) 7491 1727203960.61947: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203960.61951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203960.61963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203960.62099: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203960.62115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.62134: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 7491 1727203960.62146: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203960.62156: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203960.62169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203960.62185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203960.62208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203960.62219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203960.62230: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203960.62243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.62433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203960.62457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203960.62476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203960.62551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203960.64314: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203960.64350: stderr chunk 
(state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203960.64400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmplpfvt3o4 /root/.ansible/tmp/ansible-tmp-1727203960.2472603-7676-280148452040430/AnsiballZ_stat.py <<< 7491 1727203960.64438: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203960.65696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203960.65770: stderr chunk (state=3): >>><<< 7491 1727203960.65773: stdout chunk (state=3): >>><<< 7491 1727203960.65891: done transferring module to remote 7491 1727203960.65894: _low_level_execute_command(): starting 7491 1727203960.65897: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203960.2472603-7676-280148452040430/ /root/.ansible/tmp/ansible-tmp-1727203960.2472603-7676-280148452040430/AnsiballZ_stat.py && sleep 0' 7491 1727203960.67153: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203960.67157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203960.67289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203960.67292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.67412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727203960.67418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.67482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203960.67486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203960.67548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203960.69283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203960.69337: stderr chunk (state=3): >>><<< 7491 1727203960.69341: stdout chunk (state=3): >>><<< 7491 1727203960.69358: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 7491 1727203960.69361: _low_level_execute_command(): starting 7491 1727203960.69370: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203960.2472603-7676-280148452040430/AnsiballZ_stat.py && sleep 0' 7491 1727203960.70011: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203960.70021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203960.70030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203960.70043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203960.70083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203960.70091: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203960.70099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.70112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203960.70121: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203960.70126: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203960.70133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203960.70142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203960.70153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203960.70160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203960.70167: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203960.70178: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.70347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203960.70352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203960.70354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203960.70418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203960.72278: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 7491 1727203960.72326: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 7491 1727203960.72354: stdout chunk (state=3): >>>import 'posix' # <<< 7491 1727203960.72389: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 7491 1727203960.72429: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 7491 1727203960.72434: stdout chunk (state=3): >>># installed zipimport hook <<< 7491 1727203960.72515: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 7491 1727203960.72524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 7491 1727203960.72547: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84e43dc0> <<< 7491 1727203960.72593: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 7491 1727203960.72604: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bd83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84e43b20> <<< 7491 1727203960.72643: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 7491 1727203960.72655: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84e43ac0> <<< 7491 1727203960.72662: stdout chunk (state=3): >>>import '_signal' # <<< 7491 1727203960.72691: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bd8490> <<< 7491 1727203960.72761: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 7491 1727203960.72772: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bd8940> <<< 7491 1727203960.72784: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bd8670> <<< 7491 1727203960.72816: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 7491 1727203960.72832: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 7491 1727203960.72848: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 7491 1727203960.72879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 7491 1727203960.72887: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 7491 1727203960.72903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 7491 1727203960.72926: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b8f190> <<< 7491 1727203960.72950: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 7491 1727203960.72958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 7491 1727203960.73033: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b8f220> <<< 7491 1727203960.73062: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 7491 1727203960.73097: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bb2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b8f940> <<< 7491 1727203960.73135: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f2c84bf0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 7491 1727203960.73151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b88d90> <<< 7491 1727203960.73226: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py <<< 7491 1727203960.73237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bb2d90> <<< 7491 1727203960.73541: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bd8970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 7491 1727203960.73657: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 7491 1727203960.73702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 7491 1727203960.73739: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 7491 1727203960.73772: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 7491 1727203960.73805: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 7491 1727203960.73891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b2ef10> <<< 7491 1727203960.74042: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b340a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 7491 1727203960.74067: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py<<< 7491 1727203960.74085: stdout chunk (state=3): >>> <<< 7491 1727203960.74101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc'<<< 7491 1727203960.74105: stdout chunk 
(state=3): >>> <<< 7491 1727203960.74140: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b275b0><<< 7491 1727203960.74145: stdout chunk (state=3): >>> <<< 7491 1727203960.74183: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b2f6a0> <<< 7491 1727203960.74214: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b2e3d0><<< 7491 1727203960.74220: stdout chunk (state=3): >>> <<< 7491 1727203960.74247: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py<<< 7491 1727203960.74252: stdout chunk (state=3): >>> <<< 7491 1727203960.74353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc'<<< 7491 1727203960.74358: stdout chunk (state=3): >>> <<< 7491 1727203960.74401: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py<<< 7491 1727203960.74406: stdout chunk (state=3): >>> <<< 7491 1727203960.74456: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc'<<< 7491 1727203960.74461: stdout chunk (state=3): >>> <<< 7491 1727203960.74498: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py<<< 7491 1727203960.74502: stdout chunk (state=3): >>> <<< 7491 1727203960.74524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc'<<< 7491 1727203960.74533: stdout chunk (state=3): >>> <<< 7491 1727203960.74574: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so'<<< 7491 1727203960.74585: stdout chunk (state=3): >>> 
<<< 7491 1727203960.74601: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so'<<< 7491 1727203960.74604: stdout chunk (state=3): >>> import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84ab1eb0><<< 7491 1727203960.74614: stdout chunk (state=3): >>> <<< 7491 1727203960.74625: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ab19a0><<< 7491 1727203960.74630: stdout chunk (state=3): >>> <<< 7491 1727203960.74662: stdout chunk (state=3): >>>import 'itertools' # <<< 7491 1727203960.74668: stdout chunk (state=3): >>> <<< 7491 1727203960.74706: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py<<< 7491 1727203960.74734: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' <<< 7491 1727203960.74746: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ab1fa0> <<< 7491 1727203960.74790: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py<<< 7491 1727203960.74795: stdout chunk (state=3): >>> <<< 7491 1727203960.74821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc'<<< 7491 1727203960.74828: stdout chunk (state=3): >>> <<< 7491 1727203960.74860: stdout chunk (state=3): >>>import '_operator' # <<< 7491 1727203960.74875: stdout chunk (state=3): >>> <<< 7491 1727203960.74884: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ab1df0><<< 7491 1727203960.74891: stdout chunk (state=3): >>> <<< 7491 1727203960.74921: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py<<< 
7491 1727203960.74935: stdout chunk (state=3): >>> <<< 7491 1727203960.74947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 7491 1727203960.74971: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac1160><<< 7491 1727203960.74983: stdout chunk (state=3): >>> <<< 7491 1727203960.75001: stdout chunk (state=3): >>>import '_collections' # <<< 7491 1727203960.75007: stdout chunk (state=3): >>> <<< 7491 1727203960.75071: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b09e20><<< 7491 1727203960.75084: stdout chunk (state=3): >>> <<< 7491 1727203960.75099: stdout chunk (state=3): >>>import '_functools' # <<< 7491 1727203960.75107: stdout chunk (state=3): >>> <<< 7491 1727203960.75150: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b01700><<< 7491 1727203960.75154: stdout chunk (state=3): >>> <<< 7491 1727203960.75234: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py<<< 7491 1727203960.75245: stdout chunk (state=3): >>> <<< 7491 1727203960.75259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 7491 1727203960.75277: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b15760><<< 7491 1727203960.75291: stdout chunk (state=3): >>> <<< 7491 1727203960.75301: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b35eb0><<< 7491 1727203960.75311: stdout chunk (state=3): >>> <<< 7491 1727203960.75340: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py<<< 7491 1727203960.75356: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc'<<< 7491 1727203960.75361: stdout chunk (state=3): >>> <<< 7491 1727203960.75405: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so'<<< 7491 1727203960.75430: stdout chunk (state=3): >>> # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so'<<< 7491 1727203960.75441: stdout chunk (state=3): >>> import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84ac1d60> <<< 7491 1727203960.75474: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b09340> <<< 7491 1727203960.75525: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203960.75574: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84b15370> <<< 7491 1727203960.75592: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b3ba60> <<< 7491 1727203960.75636: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 7491 1727203960.75655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 7491 1727203960.75702: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 7491 1727203960.75731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203960.75767: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc'<<< 7491 1727203960.75789: stdout chunk (state=3): >>> <<< 7491 1727203960.75808: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac1f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac1e80><<< 7491 1727203960.75829: stdout chunk (state=3): >>> <<< 7491 1727203960.75878: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac1df0> <<< 7491 1727203960.75918: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 7491 1727203960.75934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 7491 1727203960.75976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 7491 1727203960.75990: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 7491 1727203960.76019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 7491 1727203960.76094: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py <<< 7491 1727203960.76106: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84a95460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 7491 1727203960.76857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84a95550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84a730d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac4b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac44c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849c92b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84a80d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac4fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b3b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from 
'/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849d9be0> <<< 7491 1727203960.77041: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849d9f10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849ec820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 7491 1727203960.77101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849ecd60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c8497a490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849d9f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module 
'_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c8498a370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849ec6a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203960.77233: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c8498a430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac1ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849a6790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 7491 1727203960.77236: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203960.77238: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849a6a60> 
import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849a6850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849a6940> <<< 7491 1727203960.77289: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 7491 1727203960.77503: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849a6d90> <<< 7491 1727203960.77541: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849b02e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849a69d0> <<< 7491 1727203960.77552: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c8499ab20> <<< 7491 1727203960.77576: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac16a0> <<< 7491 1727203960.77597: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 7491 1727203960.77646: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 7491 1727203960.77681: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849a6b80> <<< 7491 1727203960.77778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 7491 1727203960.77790: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2c848c9760> <<< 7491 1727203960.77915: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip' # zipimport: zlib available <<< 7491 1727203960.78018: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.78048: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 7491 1727203960.78079: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 7491 1727203960.78089: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.79743: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.81256: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f08b0> <<< 7491 1727203960.81298: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203960.81328: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 7491 1727203960.81332: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 7491 1727203960.81367: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847f0160> <<< 7491 1727203960.81380: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f0280> <<< 7491 1727203960.81540: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f05e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f04f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f0e20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847f0580> <<< 7491 1727203960.81546: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from 
'/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 7491 1727203960.81611: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f0100> <<< 7491 1727203960.81643: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 7491 1727203960.81647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 7491 1727203960.81683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 7491 1727203960.81686: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 7491 1727203960.81741: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84747fd0> <<< 7491 1727203960.81796: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84765c40> <<< 7491 1727203960.81848: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84765f40> <<< 7491 1727203960.81851: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 7491 1727203960.81883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 7491 1727203960.81899: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847652e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84858d90> <<< 7491 1727203960.82082: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c848583a0> <<< 7491 1727203960.82111: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84858f40> <<< 7491 1727203960.82158: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 7491 1727203960.82174: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 7491 1727203960.82218: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 7491 1727203960.82221: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c848c9a90> <<< 7491 1727203960.82321: stdout chunk (state=3): 
>>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847c3dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847c3490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847fa580> <<< 7491 1727203960.82370: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847c35b0> <<< 7491 1727203960.82374: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847c35e0> <<< 7491 1727203960.82420: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 7491 1727203960.82425: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 7491 1727203960.82438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 7491 1727203960.82503: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f2c84738f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c848382e0> <<< 7491 1727203960.82535: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 7491 1727203960.82596: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203960.82599: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847357f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84838460> <<< 7491 1727203960.82611: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 7491 1727203960.82661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203960.82676: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 7491 1727203960.82724: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84850f40> <<< 7491 1727203960.82845: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84735790> <<< 7491 1727203960.82935: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from 
'/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847355e0> <<< 7491 1727203960.82961: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84734550> <<< 7491 1727203960.83038: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84734490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c8482f9a0> <<< 7491 1727203960.83054: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 7491 1727203960.83057: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 7491 1727203960.83060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 7491 1727203960.83099: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 7491 1727203960.83116: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847b96a0> <<< 7491 1727203960.83561: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847b8bb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847c90d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847b9100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847fcc40> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 7491 1727203960.83704: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 7491 1727203960.84787: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available <<< 7491 1727203960.85569: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 7491 1727203960.85623: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 7491 1727203960.85649: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 7491 1727203960.85690: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 7491 1727203960.85729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203960.85852: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c841e6940> <<< 7491 1727203960.85983: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 7491 1727203960.85987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 7491 1727203960.86030: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847b6d30> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847ad7c0> <<< 7491 1727203960.86095: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip 
/tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 7491 1727203960.86113: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.86147: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.86184: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 7491 1727203960.86214: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.86419: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.86638: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py<<< 7491 1727203960.86655: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc'<<< 7491 1727203960.86657: stdout chunk (state=3): >>> <<< 7491 1727203960.86700: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847b84c0><<< 7491 1727203960.86702: stdout chunk (state=3): >>> <<< 7491 1727203960.86729: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.87391: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.88040: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.88044: stdout chunk (state=3): >>> <<< 7491 1727203960.88130: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.88239: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 7491 1727203960.88276: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.88331: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.88423: stdout chunk (state=3): >>>import 
ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 7491 1727203960.88426: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.88519: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.88652: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/errors.py<<< 7491 1727203960.88655: stdout chunk (state=3): >>> <<< 7491 1727203960.88680: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.88712: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.88754: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 7491 1727203960.88757: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.88814: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.88872: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 7491 1727203960.88899: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.88905: stdout chunk (state=3): >>> <<< 7491 1727203960.89220: stdout chunk (state=3): >>># zipimport: zlib available<<< 7491 1727203960.89233: stdout chunk (state=3): >>> <<< 7491 1727203960.89538: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 7491 1727203960.89598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 7491 1727203960.89628: stdout chunk (state=3): >>>import '_ast' # <<< 7491 1727203960.89631: stdout chunk (state=3): 
>>> <<< 7491 1727203960.89737: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c83d73940> <<< 7491 1727203960.89767: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.89886: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.90002: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py <<< 7491 1727203960.90006: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py<<< 7491 1727203960.90037: stdout chunk (state=3): >>> import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 7491 1727203960.90062: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.90103: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.90145: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 7491 1727203960.90151: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.90198: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.90250: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.90372: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.90454: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 7491 1727203960.90494: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 7491 1727203960.90590: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84843b50> <<< 7491 1727203960.90636: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c83d72070> <<< 7491 1727203960.90691: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/file.py <<< 7491 1727203960.90695: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 7491 1727203960.90883: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.90959: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.90995: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.91048: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 7491 1727203960.91062: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 7491 1727203960.91088: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 7491 1727203960.91135: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 7491 1727203960.91158: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 7491 1727203960.91182: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 7491 1727203960.91327: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c83dc36d0> <<< 7491 1727203960.91387: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c841ddc10> <<< 7491 1727203960.91471: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c841dc5b0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 7491 1727203960.91479: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.91513: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.91541: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 7491 1727203960.91638: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 7491 1727203960.91649: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.91668: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip 
/tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 7491 1727203960.91701: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.91871: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.92138: stdout chunk (state=3): >>># zipimport: zlib available <<< 7491 1727203960.92358: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 7491 1727203960.92718: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv <<< 7491 1727203960.92754: stdout chunk (state=3): >>># clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io <<< 7491 1727203960.92802: stdout chunk (state=3): >>># cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing 
genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse <<< 7491 1727203960.92808: stdout chunk (state=3): >>># cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib <<< 7491 1727203960.92811: stdout chunk (state=3): >>># cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 <<< 7491 1727203960.92834: stdout chunk (state=3): >>># cleanup[2] removing bz2 # 
cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible <<< 7491 1727203960.92839: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize <<< 7491 1727203960.92842: stdout chunk (state=3): >>># cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing 
systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat <<< 7491 1727203960.92847: stdout chunk (state=3): >>># destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool <<< 7491 1727203960.92850: stdout chunk (state=3): >>># destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # 
cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 7491 1727203960.92860: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 7491 1727203960.93098: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 7491 1727203960.93187: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 7491 1727203960.93220: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid <<< 7491 1727203960.93253: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 7491 1727203960.93270: stdout chunk (state=3): >>># destroy selinux # destroy distro # 
destroy json # destroy shlex # destroy logging # destroy argparse <<< 7491 1727203960.93341: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 7491 1727203960.93432: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 7491 1727203960.93508: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # 
cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 7491 1727203960.93540: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 7491 1727203960.93742: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 7491 1727203960.93833: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 7491 1727203960.93868: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks <<< 7491 1727203960.94326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203960.94339: stdout chunk (state=3): >>><<< 7491 1727203960.94351: stderr chunk (state=3): >>><<< 7491 1727203960.94414: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84e43dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bd83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84e43b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84e43ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bd8490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bd8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bd8670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b8f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b8f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # 
code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bb2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b8f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bf0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b88d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bb2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84bd8970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b2ef10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b340a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b275b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b2f6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b2e3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84ab1eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ab19a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ab1fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ab1df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac1160> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b09e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b01700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b15760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b35eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84ac1d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b09340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84b15370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b3ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac1f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac1e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac1df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84a95460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84a95550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84a730d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac4b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac44c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849c92b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84a80d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac4fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84b3b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849d9be0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849d9f10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849ec820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849ecd60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c8497a490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849d9f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c8498a370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849ec6a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c8498a430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac1ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849a6790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849a6a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849a6850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849a6940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849a6d90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c849b02e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849a69d0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c8499ab20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84ac16a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c849a6b80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2c848c9760> # zipimport: found 30 names in '/tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f08b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847f0160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f0280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f05e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f04f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f0e20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847f0580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847f0100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2c84747fd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84765c40> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84765f40> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847652e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84858d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c848583a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84858f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c848c9a90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847c3dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847c3490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847fa580> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847c35b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847c35e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84738f70> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c848382e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847357f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84838460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84850f40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c84735790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847355e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84734550> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84734490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c8482f9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847b96a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847b8bb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847c90d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c847b9100> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2c847fcc40> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c841e6940> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847b6d30> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847ad7c0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c847b84c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c83d73940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' 
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c84843b50> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c83d72070> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c83dc36d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c841ddc10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c841dc5b0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_kt7onc1a/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # 
cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] 
removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # 
cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] 
removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # 
destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # 
destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 7491 1727203960.94954: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203960.2472603-7676-280148452040430/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203960.94958: _low_level_execute_command(): starting 7491 1727203960.94960: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203960.2472603-7676-280148452040430/ > /dev/null 2>&1 && sleep 0' 7491 1727203960.95128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203960.95131: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203960.95134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203960.95177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.95181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203960.95183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203960.95185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203960.95245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203960.95248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203960.95250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203960.95300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203960.97766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203960.97844: stderr chunk (state=3): >>><<< 7491 1727203960.97847: stdout chunk (state=3): >>><<< 7491 1727203960.97870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203960.98070: handler run complete 7491 1727203960.98074: attempt loop complete, returning result 7491 1727203960.98076: _execute() done 7491 1727203960.98078: dumping result to json 7491 1727203960.98081: done dumping result, returning 7491 1727203960.98083: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree [0affcd87-79f5-0a4a-ad01-000000000168] 7491 1727203960.98085: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000168 7491 1727203960.98157: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000168 7491 1727203960.98161: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 7491 1727203960.98221: no more pending results, returning what we have 7491 1727203960.98224: results queue empty 7491 1727203960.98225: checking for any_errors_fatal 7491 1727203960.98231: done checking for any_errors_fatal 7491 1727203960.98232: checking for max_fail_percentage 7491 1727203960.98233: done checking for max_fail_percentage 
7491 1727203960.98234: checking to see if all hosts have failed and the running result is not ok 7491 1727203960.98235: done checking to see if all hosts have failed 7491 1727203960.98236: getting the remaining hosts for this loop 7491 1727203960.98237: done getting the remaining hosts for this loop 7491 1727203960.98241: getting the next task for host managed-node3 7491 1727203960.98246: done getting next task for host managed-node3 7491 1727203960.98248: ^ task is: TASK: Set flag to indicate system is ostree 7491 1727203960.98250: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203960.98255: getting variables 7491 1727203960.98257: in VariableManager get_vars() 7491 1727203960.98288: Calling all_inventory to load vars for managed-node3 7491 1727203960.98290: Calling groups_inventory to load vars for managed-node3 7491 1727203960.98294: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203960.98304: Calling all_plugins_play to load vars for managed-node3 7491 1727203960.98306: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203960.98309: Calling groups_plugins_play to load vars for managed-node3 7491 1727203960.98469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203960.98668: done with get_vars() 7491 1727203960.98688: done getting variables 7491 1727203960.98792: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:52:40 -0400 (0:00:00.790) 0:00:02.912 ***** 7491 1727203960.98823: entering _queue_task() for managed-node3/set_fact 7491 1727203960.98825: Creating lock for set_fact 7491 1727203960.99092: worker is 1 (out of 1 available) 7491 1727203960.99104: exiting _queue_task() for managed-node3/set_fact 7491 1727203960.99125: done queuing things up, now waiting for results queue to drain 7491 1727203960.99127: waiting for pending results... 
7491 1727203960.99385: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 7491 1727203960.99480: in run() - task 0affcd87-79f5-0a4a-ad01-000000000169 7491 1727203960.99491: variable 'ansible_search_path' from source: unknown 7491 1727203960.99495: variable 'ansible_search_path' from source: unknown 7491 1727203960.99526: calling self._execute() 7491 1727203960.99612: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203960.99618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203960.99625: variable 'omit' from source: magic vars 7491 1727203961.00160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203961.00410: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203961.00467: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203961.00499: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203961.00551: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203961.00635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203961.00679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203961.00711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203961.00742: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203961.00885: Evaluated conditional (not __network_is_ostree is defined): True 7491 1727203961.00896: variable 'omit' from source: magic vars 7491 1727203961.00935: variable 'omit' from source: magic vars 7491 1727203961.01061: variable '__ostree_booted_stat' from source: set_fact 7491 1727203961.01126: variable 'omit' from source: magic vars 7491 1727203961.01155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203961.01198: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203961.01226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203961.01248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203961.01262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203961.01302: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203961.01317: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203961.01326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203961.01439: Set connection var ansible_timeout to 10 7491 1727203961.01451: Set connection var ansible_pipelining to False 7491 1727203961.01460: Set connection var ansible_shell_type to sh 7491 1727203961.01472: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203961.01483: Set connection var ansible_shell_executable to /bin/sh 7491 1727203961.01491: Set connection var ansible_connection to ssh 7491 1727203961.01524: variable 'ansible_shell_executable' from source: unknown 
7491 1727203961.01538: variable 'ansible_connection' from source: unknown 7491 1727203961.01545: variable 'ansible_module_compression' from source: unknown 7491 1727203961.01551: variable 'ansible_shell_type' from source: unknown 7491 1727203961.01558: variable 'ansible_shell_executable' from source: unknown 7491 1727203961.01567: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203961.01576: variable 'ansible_pipelining' from source: unknown 7491 1727203961.01582: variable 'ansible_timeout' from source: unknown 7491 1727203961.01590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203961.01707: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203961.01722: variable 'omit' from source: magic vars 7491 1727203961.01738: starting attempt loop 7491 1727203961.01752: running the handler 7491 1727203961.01776: handler run complete 7491 1727203961.01790: attempt loop complete, returning result 7491 1727203961.01797: _execute() done 7491 1727203961.01804: dumping result to json 7491 1727203961.01810: done dumping result, returning 7491 1727203961.01820: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [0affcd87-79f5-0a4a-ad01-000000000169] 7491 1727203961.01830: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000169 7491 1727203961.01940: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000169 7491 1727203961.01955: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 7491 1727203961.02019: no more pending results, returning what we have 7491 1727203961.02022: results queue empty 7491 1727203961.02023: checking for 
any_errors_fatal 7491 1727203961.02029: done checking for any_errors_fatal 7491 1727203961.02030: checking for max_fail_percentage 7491 1727203961.02032: done checking for max_fail_percentage 7491 1727203961.02032: checking to see if all hosts have failed and the running result is not ok 7491 1727203961.02033: done checking to see if all hosts have failed 7491 1727203961.02034: getting the remaining hosts for this loop 7491 1727203961.02036: done getting the remaining hosts for this loop 7491 1727203961.02040: getting the next task for host managed-node3 7491 1727203961.02050: done getting next task for host managed-node3 7491 1727203961.02055: ^ task is: TASK: Fix CentOS6 Base repo 7491 1727203961.02058: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203961.02062: getting variables 7491 1727203961.02065: in VariableManager get_vars() 7491 1727203961.02095: Calling all_inventory to load vars for managed-node3 7491 1727203961.02099: Calling groups_inventory to load vars for managed-node3 7491 1727203961.02103: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203961.02114: Calling all_plugins_play to load vars for managed-node3 7491 1727203961.02117: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203961.02128: Calling groups_plugins_play to load vars for managed-node3 7491 1727203961.02343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203961.02555: done with get_vars() 7491 1727203961.02568: done getting variables 7491 1727203961.02774: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:52:41 -0400 (0:00:00.039) 0:00:02.951 ***** 7491 1727203961.02808: entering _queue_task() for managed-node3/copy 7491 1727203961.03239: worker is 1 (out of 1 available) 7491 1727203961.03261: exiting _queue_task() for managed-node3/copy 7491 1727203961.03275: done queuing things up, now waiting for results queue to drain 7491 1727203961.03276: waiting for pending results... 
7491 1727203961.03538: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo
7491 1727203961.03608: in run() - task 0affcd87-79f5-0a4a-ad01-00000000016b
7491 1727203961.03629: variable 'ansible_search_path' from source: unknown
7491 1727203961.03632: variable 'ansible_search_path' from source: unknown
7491 1727203961.03659: calling self._execute()
7491 1727203961.03718: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203961.03725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203961.03733: variable 'omit' from source: magic vars
7491 1727203961.04074: variable 'ansible_distribution' from source: facts
7491 1727203961.04090: Evaluated conditional (ansible_distribution == 'CentOS'): True
7491 1727203961.04177: variable 'ansible_distribution_major_version' from source: facts
7491 1727203961.04183: Evaluated conditional (ansible_distribution_major_version == '6'): False
7491 1727203961.04186: when evaluation is False, skipping this task
7491 1727203961.04189: _execute() done
7491 1727203961.04191: dumping result to json
7491 1727203961.04193: done dumping result, returning
7491 1727203961.04200: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [0affcd87-79f5-0a4a-ad01-00000000016b]
7491 1727203961.04206: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000016b
7491 1727203961.04301: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000016b
7491 1727203961.04304: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
7491 1727203961.04366: no more pending results, returning what we have
7491 1727203961.04369: results queue empty
7491 1727203961.04370: checking for any_errors_fatal
7491 1727203961.04375: done checking for any_errors_fatal
7491 1727203961.04375: checking for max_fail_percentage
7491 1727203961.04377: done checking for max_fail_percentage
7491 1727203961.04378: checking to see if all hosts have failed and the running result is not ok
7491 1727203961.04378: done checking to see if all hosts have failed
7491 1727203961.04379: getting the remaining hosts for this loop
7491 1727203961.04381: done getting the remaining hosts for this loop
7491 1727203961.04384: getting the next task for host managed-node3
7491 1727203961.04389: done getting next task for host managed-node3
7491 1727203961.04392: ^ task is: TASK: Include the task 'enable_epel.yml'
7491 1727203961.04395: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203961.04398: getting variables
7491 1727203961.04399: in VariableManager get_vars()
7491 1727203961.04423: Calling all_inventory to load vars for managed-node3
7491 1727203961.04425: Calling groups_inventory to load vars for managed-node3
7491 1727203961.04428: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203961.04437: Calling all_plugins_play to load vars for managed-node3
7491 1727203961.04440: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203961.04443: Calling groups_plugins_play to load vars for managed-node3
7491 1727203961.04549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203961.04687: done with get_vars()
7491 1727203961.04694: done getting variables

TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Tuesday 24 September 2024 14:52:41 -0400 (0:00:00.019) 0:00:02.971 *****
7491 1727203961.04757: entering _queue_task() for managed-node3/include_tasks
7491 1727203961.04942: worker is 1 (out of 1 available)
7491 1727203961.04954: exiting _queue_task() for managed-node3/include_tasks
7491 1727203961.04969: done queuing things up, now waiting for results queue to drain
7491 1727203961.04971: waiting for pending results...
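The skip of "Fix CentOS6 Base repo" above shows how Ansible evaluates a list-form `when:` one conditional at a time: `ansible_distribution == 'CentOS'` is True, `ansible_distribution_major_version == '6'` is False, and the first False short-circuits the task, with that expression recorded as `false_condition` in the result. A minimal sketch of a task gated this way (the module and its body are illustrative assumptions, not the actual contents of el_repo_setup.yml, which the log does not show):

```yaml
# Hypothetical sketch of a task gated like "Fix CentOS6 Base repo".
# Items in a list under `when:` are ANDed and evaluated in order;
# the first False one is reported back as `false_condition`.
- name: Fix CentOS6 Base repo
  copy:                                    # illustrative module; real task body not in the log
    dest: /etc/yum.repos.d/CentOS-Base.repo  # assumed path
    content: "..."                           # placeholder
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```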
7491 1727203961.05146: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml'
7491 1727203961.05265: in run() - task 0affcd87-79f5-0a4a-ad01-00000000016c
7491 1727203961.05284: variable 'ansible_search_path' from source: unknown
7491 1727203961.05291: variable 'ansible_search_path' from source: unknown
7491 1727203961.05341: calling self._execute()
7491 1727203961.05428: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203961.05440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203961.05454: variable 'omit' from source: magic vars
7491 1727203961.05982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7491 1727203961.07886: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7491 1727203961.07960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7491 1727203961.08006: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7491 1727203961.08050: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7491 1727203961.08084: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7491 1727203961.08171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7491 1727203961.08206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7491 1727203961.08239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7491 1727203961.08291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7491 1727203961.08311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7491 1727203961.08472: variable '__network_is_ostree' from source: set_fact
7491 1727203961.09019: Evaluated conditional (not __network_is_ostree | d(false)): True
7491 1727203961.09038: _execute() done
7491 1727203961.09069: dumping result to json
7491 1727203961.09089: done dumping result, returning
7491 1727203961.09099: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-0a4a-ad01-00000000016c]
7491 1727203961.09109: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000016c
7491 1727203961.09251: no more pending results, returning what we have
7491 1727203961.09257: in VariableManager get_vars()
7491 1727203961.09299: Calling all_inventory to load vars for managed-node3
7491 1727203961.09302: Calling groups_inventory to load vars for managed-node3
7491 1727203961.09306: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203961.09318: Calling all_plugins_play to load vars for managed-node3
7491 1727203961.09321: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203961.09325: Calling groups_plugins_play to load vars for managed-node3
7491 1727203961.09506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203961.09719: done with get_vars()
7491 1727203961.09729: variable 'ansible_search_path' from source: unknown
7491 1727203961.09731: variable 'ansible_search_path' from source: unknown
7491 1727203961.09776: we have included files to process
7491 1727203961.09778: generating all_blocks data
7491 1727203961.09780: done generating all_blocks data
7491 1727203961.09798: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
7491 1727203961.09799: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
7491 1727203961.09803: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
7491 1727203961.10419: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000016c
7491 1727203961.10423: WORKER PROCESS EXITING
7491 1727203961.10793: done processing included file
7491 1727203961.10796: iterating over new_blocks loaded from include file
7491 1727203961.10797: in VariableManager get_vars()
7491 1727203961.10806: done with get_vars()
7491 1727203961.10807: filtering new block on tags
7491 1727203961.10824: done filtering new block on tags
7491 1727203961.10826: in VariableManager get_vars()
7491 1727203961.10843: done with get_vars()
7491 1727203961.10845: filtering new block on tags
7491 1727203961.10861: done filtering new block on tags
7491 1727203961.10863: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3
7491 1727203961.10870: extending task lists for all hosts with included blocks
7491 1727203961.10967: done extending task lists
7491 1727203961.10968: done processing included files
7491 1727203961.10969: results queue empty
7491 1727203961.10970: checking for any_errors_fatal
7491 1727203961.10973: done checking for any_errors_fatal
7491 1727203961.10974: checking for max_fail_percentage
7491 1727203961.10975: done checking for max_fail_percentage
7491 1727203961.10976: checking to see if all hosts have failed and the running result is not ok
7491 1727203961.10977: done checking to see if all hosts have failed
7491 1727203961.10977: getting the remaining hosts for this loop
7491 1727203961.10979: done getting the remaining hosts for this loop
7491 1727203961.10981: getting the next task for host managed-node3
7491 1727203961.10985: done getting next task for host managed-node3
7491 1727203961.10987: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }}
7491 1727203961.10990: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203961.10992: getting variables
7491 1727203961.10993: in VariableManager get_vars()
7491 1727203961.11001: Calling all_inventory to load vars for managed-node3
7491 1727203961.11003: Calling groups_inventory to load vars for managed-node3
7491 1727203961.11005: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203961.11010: Calling all_plugins_play to load vars for managed-node3
7491 1727203961.11020: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203961.11023: Calling groups_plugins_play to load vars for managed-node3
7491 1727203961.11155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203961.11353: done with get_vars()
7491 1727203961.11361: done getting variables
7491 1727203961.11431: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
7491 1727203961.11634: variable 'ansible_distribution_major_version' from source: facts

TASK [Create EPEL 9] ***********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Tuesday 24 September 2024 14:52:41 -0400 (0:00:00.069) 0:00:03.040 *****
7491 1727203961.11682: entering _queue_task() for managed-node3/command
7491 1727203961.11683: Creating lock for command
7491 1727203961.11952: worker is 1 (out of 1 available)
7491 1727203961.11966: exiting _queue_task() for managed-node3/command
7491 1727203961.11979: done queuing things up, now waiting for results queue to drain
7491 1727203961.11980: waiting for pending results...
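Two details of the trace above are worth noting: the include itself is gated on `not __network_is_ostree | d(false)`, and the included task's name `Create EPEL {{ ansible_distribution_major_version }}` is templated from facts at execution time, which is why the banner reads "Create EPEL 9" on this node. A sketch of that shape (the conditionals and names come from the log; the task body is an assumption, since the log never shows it):

```yaml
# Hypothetical sketch of the include at el_repo_setup.yml:51 and the
# templated task at enable_epel.yml:8. The task name is rendered per host
# from facts; the `when:` list is evaluated before the module runs.
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)

# Inside enable_epel.yml:
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "..."                # placeholder; real command not in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```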
7491 1727203961.12232: running TaskExecutor() for managed-node3/TASK: Create EPEL 9
7491 1727203961.12414: in run() - task 0affcd87-79f5-0a4a-ad01-000000000186
7491 1727203961.12441: variable 'ansible_search_path' from source: unknown
7491 1727203961.12448: variable 'ansible_search_path' from source: unknown
7491 1727203961.12486: calling self._execute()
7491 1727203961.12573: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203961.12590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203961.12603: variable 'omit' from source: magic vars
7491 1727203961.13008: variable 'ansible_distribution' from source: facts
7491 1727203961.13028: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
7491 1727203961.13172: variable 'ansible_distribution_major_version' from source: facts
7491 1727203961.13187: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
7491 1727203961.13194: when evaluation is False, skipping this task
7491 1727203961.13200: _execute() done
7491 1727203961.13207: dumping result to json
7491 1727203961.13213: done dumping result, returning
7491 1727203961.13225: done running TaskExecutor() for managed-node3/TASK: Create EPEL 9 [0affcd87-79f5-0a4a-ad01-000000000186]
7491 1727203961.13235: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000186
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
7491 1727203961.13414: no more pending results, returning what we have
7491 1727203961.13420: results queue empty
7491 1727203961.13421: checking for any_errors_fatal
7491 1727203961.13423: done checking for any_errors_fatal
7491 1727203961.13423: checking for max_fail_percentage
7491 1727203961.13425: done checking for max_fail_percentage
7491 1727203961.13426: checking to see if all hosts have failed and the running result is not ok
7491 1727203961.13427: done checking to see if all hosts have failed
7491 1727203961.13428: getting the remaining hosts for this loop
7491 1727203961.13430: done getting the remaining hosts for this loop
7491 1727203961.13434: getting the next task for host managed-node3
7491 1727203961.13441: done getting next task for host managed-node3
7491 1727203961.13443: ^ task is: TASK: Install yum-utils package
7491 1727203961.13447: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203961.13451: getting variables
7491 1727203961.13453: in VariableManager get_vars()
7491 1727203961.13484: Calling all_inventory to load vars for managed-node3
7491 1727203961.13487: Calling groups_inventory to load vars for managed-node3
7491 1727203961.13491: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203961.13505: Calling all_plugins_play to load vars for managed-node3
7491 1727203961.13508: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203961.13512: Calling groups_plugins_play to load vars for managed-node3
7491 1727203961.13811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203961.14003: done with get_vars()
7491 1727203961.14013: done getting variables
7491 1727203961.14142: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Tuesday 24 September 2024 14:52:41 -0400 (0:00:00.024) 0:00:03.065 *****
7491 1727203961.14179: entering _queue_task() for managed-node3/package
7491 1727203961.14182: Creating lock for package
7491 1727203961.14226: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000186
7491 1727203961.14236: WORKER PROCESS EXITING
7491 1727203961.14674: worker is 1 (out of 1 available)
7491 1727203961.14688: exiting _queue_task() for managed-node3/package
7491 1727203961.14700: done queuing things up, now waiting for results queue to drain
7491 1727203961.14701: waiting for pending results...
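The "Install yum-utils package" task queued above loads the generic `package` action, which resolves to the appropriate backend (dnf/yum) on the target; on this CentOS 9 host the `['7', '8']` version guard fails, so the module body never executes and the result carries only `skip_reason`. A sketch of the likely task shape, assuming a standard `package` invocation (the name and conditionals are from the log; the exact parameters are not shown there):

```yaml
# Hypothetical sketch of the "Install yum-utils package" task
# (enable_epel.yml:26). `package` delegates to the platform's
# package manager; the guard skips it entirely on EL9.
- name: Install yum-utils package
  package:
    name: yum-utils      # implied by the task name; assumed parameter
    state: present       # assumed
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```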
7491 1727203961.15107: running TaskExecutor() for managed-node3/TASK: Install yum-utils package
7491 1727203961.15476: in run() - task 0affcd87-79f5-0a4a-ad01-000000000187
7491 1727203961.15496: variable 'ansible_search_path' from source: unknown
7491 1727203961.15504: variable 'ansible_search_path' from source: unknown
7491 1727203961.15549: calling self._execute()
7491 1727203961.15668: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203961.15684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203961.15702: variable 'omit' from source: magic vars
7491 1727203961.16194: variable 'ansible_distribution' from source: facts
7491 1727203961.16213: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
7491 1727203961.16360: variable 'ansible_distribution_major_version' from source: facts
7491 1727203961.16375: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
7491 1727203961.16378: when evaluation is False, skipping this task
7491 1727203961.16381: _execute() done
7491 1727203961.16385: dumping result to json
7491 1727203961.16387: done dumping result, returning
7491 1727203961.16392: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [0affcd87-79f5-0a4a-ad01-000000000187]
7491 1727203961.16399: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000187
7491 1727203961.16496: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000187
7491 1727203961.16501: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
7491 1727203961.16547: no more pending results, returning what we have
7491 1727203961.16551: results queue empty
7491 1727203961.16552: checking for any_errors_fatal
7491 1727203961.16558: done checking for any_errors_fatal
7491 1727203961.16559: checking for max_fail_percentage
7491 1727203961.16560: done checking for max_fail_percentage
7491 1727203961.16561: checking to see if all hosts have failed and the running result is not ok
7491 1727203961.16562: done checking to see if all hosts have failed
7491 1727203961.16563: getting the remaining hosts for this loop
7491 1727203961.16567: done getting the remaining hosts for this loop
7491 1727203961.16571: getting the next task for host managed-node3
7491 1727203961.16577: done getting next task for host managed-node3
7491 1727203961.16579: ^ task is: TASK: Enable EPEL 7
7491 1727203961.16583: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203961.16587: getting variables
7491 1727203961.16589: in VariableManager get_vars()
7491 1727203961.16675: Calling all_inventory to load vars for managed-node3
7491 1727203961.16678: Calling groups_inventory to load vars for managed-node3
7491 1727203961.16682: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203961.16691: Calling all_plugins_play to load vars for managed-node3
7491 1727203961.16693: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203961.16695: Calling groups_plugins_play to load vars for managed-node3
7491 1727203961.16800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203961.16918: done with get_vars()
7491 1727203961.16926: done getting variables
7491 1727203961.16971: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Tuesday 24 September 2024 14:52:41 -0400 (0:00:00.028) 0:00:03.093 *****
7491 1727203961.16993: entering _queue_task() for managed-node3/command
7491 1727203961.17191: worker is 1 (out of 1 available)
7491 1727203961.17206: exiting _queue_task() for managed-node3/command
7491 1727203961.17222: done queuing things up, now waiting for results queue to drain
7491 1727203961.17223: waiting for pending results...
7491 1727203961.17397: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7
7491 1727203961.17522: in run() - task 0affcd87-79f5-0a4a-ad01-000000000188
7491 1727203961.17542: variable 'ansible_search_path' from source: unknown
7491 1727203961.17549: variable 'ansible_search_path' from source: unknown
7491 1727203961.17590: calling self._execute()
7491 1727203961.17685: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203961.17695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203961.17710: variable 'omit' from source: magic vars
7491 1727203961.18417: variable 'ansible_distribution' from source: facts
7491 1727203961.18445: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
7491 1727203961.18602: variable 'ansible_distribution_major_version' from source: facts
7491 1727203961.18613: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
7491 1727203961.18620: when evaluation is False, skipping this task
7491 1727203961.18626: _execute() done
7491 1727203961.18632: dumping result to json
7491 1727203961.18638: done dumping result, returning
7491 1727203961.18657: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [0affcd87-79f5-0a4a-ad01-000000000188]
7491 1727203961.18689: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000188
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
7491 1727203961.18922: no more pending results, returning what we have
7491 1727203961.18926: results queue empty
7491 1727203961.18927: checking for any_errors_fatal
7491 1727203961.18933: done checking for any_errors_fatal
7491 1727203961.18934: checking for max_fail_percentage
7491 1727203961.18936: done checking for max_fail_percentage
7491 1727203961.18937: checking to see if all hosts have failed and the running result is not ok
7491 1727203961.18938: done checking to see if all hosts have failed
7491 1727203961.18939: getting the remaining hosts for this loop
7491 1727203961.18941: done getting the remaining hosts for this loop
7491 1727203961.18945: getting the next task for host managed-node3
7491 1727203961.18952: done getting next task for host managed-node3
7491 1727203961.18954: ^ task is: TASK: Enable EPEL 8
7491 1727203961.18959: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203961.18963: getting variables
7491 1727203961.18966: in VariableManager get_vars()
7491 1727203961.19012: Calling all_inventory to load vars for managed-node3
7491 1727203961.19016: Calling groups_inventory to load vars for managed-node3
7491 1727203961.19020: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203961.19034: Calling all_plugins_play to load vars for managed-node3
7491 1727203961.19037: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203961.19040: Calling groups_plugins_play to load vars for managed-node3
7491 1727203961.19248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203961.19582: done with get_vars()
7491 1727203961.19593: done getting variables
7491 1727203961.19636: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000188
7491 1727203961.19638: WORKER PROCESS EXITING
7491 1727203961.19686: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Tuesday 24 September 2024 14:52:41 -0400 (0:00:00.027) 0:00:03.121 *****
7491 1727203961.19714: entering _queue_task() for managed-node3/command
7491 1727203961.19929: worker is 1 (out of 1 available)
7491 1727203961.19941: exiting _queue_task() for managed-node3/command
7491 1727203961.19952: done queuing things up, now waiting for results queue to drain
7491 1727203961.19953: waiting for pending results...
7491 1727203961.20089: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8
7491 1727203961.20160: in run() - task 0affcd87-79f5-0a4a-ad01-000000000189
7491 1727203961.20171: variable 'ansible_search_path' from source: unknown
7491 1727203961.20176: variable 'ansible_search_path' from source: unknown
7491 1727203961.20203: calling self._execute()
7491 1727203961.20263: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203961.20268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203961.20276: variable 'omit' from source: magic vars
7491 1727203961.20598: variable 'ansible_distribution' from source: facts
7491 1727203961.20608: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
7491 1727203961.20702: variable 'ansible_distribution_major_version' from source: facts
7491 1727203961.20706: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
7491 1727203961.20709: when evaluation is False, skipping this task
7491 1727203961.20712: _execute() done
7491 1727203961.20714: dumping result to json
7491 1727203961.20717: done dumping result, returning
7491 1727203961.20727: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [0affcd87-79f5-0a4a-ad01-000000000189]
7491 1727203961.20733: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000189
7491 1727203961.20819: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000189
7491 1727203961.20821: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
7491 1727203961.20873: no more pending results, returning what we have
7491 1727203961.20876: results queue empty
7491 1727203961.20877: checking for any_errors_fatal
7491 1727203961.20882: done checking for any_errors_fatal
7491 1727203961.20883: checking for max_fail_percentage
7491 1727203961.20884: done checking for max_fail_percentage
7491 1727203961.20885: checking to see if all hosts have failed and the running result is not ok
7491 1727203961.20886: done checking to see if all hosts have failed
7491 1727203961.20886: getting the remaining hosts for this loop
7491 1727203961.20888: done getting the remaining hosts for this loop
7491 1727203961.20891: getting the next task for host managed-node3
7491 1727203961.20898: done getting next task for host managed-node3
7491 1727203961.20900: ^ task is: TASK: Enable EPEL 6
7491 1727203961.20903: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203961.20906: getting variables
7491 1727203961.20907: in VariableManager get_vars()
7491 1727203961.20972: Calling all_inventory to load vars for managed-node3
7491 1727203961.20974: Calling groups_inventory to load vars for managed-node3
7491 1727203961.20977: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203961.20984: Calling all_plugins_play to load vars for managed-node3
7491 1727203961.20986: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203961.20987: Calling groups_plugins_play to load vars for managed-node3
7491 1727203961.21089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203961.21207: done with get_vars()
7491 1727203961.21214: done getting variables
7491 1727203961.21252: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Tuesday 24 September 2024 14:52:41 -0400 (0:00:00.015) 0:00:03.136 *****
7491 1727203961.21282: entering _queue_task() for managed-node3/copy
7491 1727203961.21466: worker is 1 (out of 1 available)
7491 1727203961.21596: exiting _queue_task() for managed-node3/copy
7491 1727203961.21607: done queuing things up, now waiting for results queue to drain
7491 1727203961.21608: waiting for pending results...
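Unlike the EPEL 7/8 tasks, which load the 'command' action, the "Enable EPEL 6" task queued above loads the 'copy' action module, suggesting the EL6 path writes a repo file directly rather than shelling out. A sketch of that shape, with the destination and content purely illustrative (neither appears in the log):

```yaml
# Hypothetical sketch of "Enable EPEL 6" (enable_epel.yml:42). The trace
# shows the 'copy' action being loaded for it; on this EL9 node the
# `== '6'` guard fails, so the copy never runs.
- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo   # assumed destination
    content: "..."                     # placeholder; real content not in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'
```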
7491 1727203961.21725: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6
7491 1727203961.21831: in run() - task 0affcd87-79f5-0a4a-ad01-00000000018b
7491 1727203961.21853: variable 'ansible_search_path' from source: unknown
7491 1727203961.21866: variable 'ansible_search_path' from source: unknown
7491 1727203961.21906: calling self._execute()
7491 1727203961.21991: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203961.22003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203961.22021: variable 'omit' from source: magic vars
7491 1727203961.22418: variable 'ansible_distribution' from source: facts
7491 1727203961.22437: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
7491 1727203961.22559: variable 'ansible_distribution_major_version' from source: facts
7491 1727203961.22573: Evaluated conditional (ansible_distribution_major_version == '6'): False
7491 1727203961.22583: when evaluation is False, skipping this task
7491 1727203961.22590: _execute() done
7491 1727203961.22597: dumping result to json
7491 1727203961.22604: done dumping result, returning
7491 1727203961.22614: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [0affcd87-79f5-0a4a-ad01-00000000018b]
7491 1727203961.22633: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000018b
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
7491 1727203961.22793: no more pending results, returning what we have
7491 1727203961.22798: results queue empty
7491 1727203961.22799: checking for any_errors_fatal
7491 1727203961.22803: done checking for any_errors_fatal
7491 1727203961.22804: checking for max_fail_percentage
7491 1727203961.22806: done checking for max_fail_percentage
7491 1727203961.22806: checking to see if all hosts have failed and the running result is not ok
7491 1727203961.22807: done checking to see if all hosts have failed
7491 1727203961.22808: getting the remaining hosts for this loop
7491 1727203961.22810: done getting the remaining hosts for this loop
7491 1727203961.22814: getting the next task for host managed-node3
7491 1727203961.22826: done getting next task for host managed-node3
7491 1727203961.22829: ^ task is: TASK: Set network provider to 'nm'
7491 1727203961.22831: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203961.22835: getting variables
7491 1727203961.22836: in VariableManager get_vars()
7491 1727203961.22869: Calling all_inventory to load vars for managed-node3
7491 1727203961.22872: Calling groups_inventory to load vars for managed-node3
7491 1727203961.22876: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203961.22888: Calling all_plugins_play to load vars for managed-node3
7491 1727203961.22891: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203961.22894: Calling groups_plugins_play to load vars for managed-node3
7491 1727203961.23095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203961.23285: done with get_vars()
7491 1727203961.23296: done getting variables
7491 1727203961.23472: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000018b
7491 1727203961.23475: WORKER PROCESS EXITING
7491 1727203961.23503: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK
[Set network provider to 'nm'] ********************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:13
Tuesday 24 September 2024 14:52:41 -0400 (0:00:00.022) 0:00:03.159 *****
7491 1727203961.23559: entering _queue_task() for managed-node3/set_fact
7491 1727203961.23866: worker is 1 (out of 1 available)
7491 1727203961.23878: exiting _queue_task() for managed-node3/set_fact
7491 1727203961.23891: done queuing things up, now waiting for results queue to drain
7491 1727203961.23892: waiting for pending results...
7491 1727203961.24051: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm'
7491 1727203961.24109: in run() - task 0affcd87-79f5-0a4a-ad01-000000000007
7491 1727203961.24126: variable 'ansible_search_path' from source: unknown
7491 1727203961.24157: calling self._execute()
7491 1727203961.24273: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203961.24283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203961.24292: variable 'omit' from source: magic vars
7491 1727203961.24369: variable 'omit' from source: magic vars
7491 1727203961.24404: variable 'omit' from source: magic vars
7491 1727203961.24432: variable 'omit' from source: magic vars
7491 1727203961.24466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7491 1727203961.24498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7491 1727203961.24523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7491 1727203961.24536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203961.24545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py
(found_in_cache=True, class_only=False) 7491 1727203961.24569: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203961.24572: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203961.24576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203961.24653: Set connection var ansible_timeout to 10 7491 1727203961.24658: Set connection var ansible_pipelining to False 7491 1727203961.24663: Set connection var ansible_shell_type to sh 7491 1727203961.24669: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203961.24676: Set connection var ansible_shell_executable to /bin/sh 7491 1727203961.24680: Set connection var ansible_connection to ssh 7491 1727203961.24697: variable 'ansible_shell_executable' from source: unknown 7491 1727203961.24700: variable 'ansible_connection' from source: unknown 7491 1727203961.24704: variable 'ansible_module_compression' from source: unknown 7491 1727203961.24712: variable 'ansible_shell_type' from source: unknown 7491 1727203961.24717: variable 'ansible_shell_executable' from source: unknown 7491 1727203961.24726: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203961.24729: variable 'ansible_pipelining' from source: unknown 7491 1727203961.24731: variable 'ansible_timeout' from source: unknown 7491 1727203961.24736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203961.24845: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203961.24853: variable 'omit' from source: magic vars 7491 1727203961.24858: starting attempt loop 7491 1727203961.24861: running the handler 7491 1727203961.24873: handler run 
complete
7491 1727203961.24881: attempt loop complete, returning result
7491 1727203961.24884: _execute() done
7491 1727203961.24886: dumping result to json
7491 1727203961.24888: done dumping result, returning
7491 1727203961.24894: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [0affcd87-79f5-0a4a-ad01-000000000007]
7491 1727203961.24899: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000007
7491 1727203961.24977: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000007
7491 1727203961.24980: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "ansible_facts": {
        "network_provider": "nm"
    },
    "changed": false
}
7491 1727203961.25033: no more pending results, returning what we have
7491 1727203961.25035: results queue empty
7491 1727203961.25036: checking for any_errors_fatal
7491 1727203961.25045: done checking for any_errors_fatal
7491 1727203961.25046: checking for max_fail_percentage
7491 1727203961.25047: done checking for max_fail_percentage
7491 1727203961.25048: checking to see if all hosts have failed and the running result is not ok
7491 1727203961.25048: done checking to see if all hosts have failed
7491 1727203961.25049: getting the remaining hosts for this loop
7491 1727203961.25051: done getting the remaining hosts for this loop
7491 1727203961.25054: getting the next task for host managed-node3
7491 1727203961.25059: done getting next task for host managed-node3
7491 1727203961.25061: ^ task is: TASK: meta (flush_handlers)
7491 1727203961.25063: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 7491 1727203961.25069: getting variables 7491 1727203961.25070: in VariableManager get_vars() 7491 1727203961.25134: Calling all_inventory to load vars for managed-node3 7491 1727203961.25136: Calling groups_inventory to load vars for managed-node3 7491 1727203961.25139: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203961.25146: Calling all_plugins_play to load vars for managed-node3 7491 1727203961.25147: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203961.25149: Calling groups_plugins_play to load vars for managed-node3 7491 1727203961.25252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203961.25414: done with get_vars() 7491 1727203961.25424: done getting variables 7491 1727203961.25499: in VariableManager get_vars() 7491 1727203961.25505: Calling all_inventory to load vars for managed-node3 7491 1727203961.25507: Calling groups_inventory to load vars for managed-node3 7491 1727203961.25527: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203961.25532: Calling all_plugins_play to load vars for managed-node3 7491 1727203961.25535: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203961.25537: Calling groups_plugins_play to load vars for managed-node3 7491 1727203961.25680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203961.25890: done with get_vars() 7491 1727203961.25903: done queuing things up, now waiting for results queue to drain 7491 1727203961.25905: results queue empty 7491 1727203961.25906: checking for any_errors_fatal 7491 1727203961.25908: done checking for any_errors_fatal 7491 1727203961.25908: checking for max_fail_percentage 7491 1727203961.25909: done checking for max_fail_percentage 7491 1727203961.25910: checking to see if all hosts have failed and the running result is not ok 7491 1727203961.25911: 
done checking to see if all hosts have failed 7491 1727203961.25911: getting the remaining hosts for this loop 7491 1727203961.25912: done getting the remaining hosts for this loop 7491 1727203961.25915: getting the next task for host managed-node3 7491 1727203961.25922: done getting next task for host managed-node3 7491 1727203961.25924: ^ task is: TASK: meta (flush_handlers) 7491 1727203961.25925: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203961.25935: getting variables 7491 1727203961.25937: in VariableManager get_vars() 7491 1727203961.25944: Calling all_inventory to load vars for managed-node3 7491 1727203961.25947: Calling groups_inventory to load vars for managed-node3 7491 1727203961.25949: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203961.25953: Calling all_plugins_play to load vars for managed-node3 7491 1727203961.25955: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203961.25958: Calling groups_plugins_play to load vars for managed-node3 7491 1727203961.26324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203961.26523: done with get_vars() 7491 1727203961.26529: done getting variables 7491 1727203961.26558: in VariableManager get_vars() 7491 1727203961.26569: Calling all_inventory to load vars for managed-node3 7491 1727203961.26571: Calling groups_inventory to load vars for managed-node3 7491 1727203961.26572: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203961.26576: Calling all_plugins_play to load vars for managed-node3 7491 1727203961.26577: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203961.26579: Calling 
groups_plugins_play to load vars for managed-node3 7491 1727203961.26684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203961.26881: done with get_vars() 7491 1727203961.26914: done queuing things up, now waiting for results queue to drain 7491 1727203961.26916: results queue empty 7491 1727203961.26917: checking for any_errors_fatal 7491 1727203961.26918: done checking for any_errors_fatal 7491 1727203961.26918: checking for max_fail_percentage 7491 1727203961.26920: done checking for max_fail_percentage 7491 1727203961.26920: checking to see if all hosts have failed and the running result is not ok 7491 1727203961.26921: done checking to see if all hosts have failed 7491 1727203961.26928: getting the remaining hosts for this loop 7491 1727203961.26930: done getting the remaining hosts for this loop 7491 1727203961.26941: getting the next task for host managed-node3 7491 1727203961.26944: done getting next task for host managed-node3 7491 1727203961.26945: ^ task is: None 7491 1727203961.26946: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203961.26948: done queuing things up, now waiting for results queue to drain 7491 1727203961.26948: results queue empty 7491 1727203961.26949: checking for any_errors_fatal 7491 1727203961.26950: done checking for any_errors_fatal 7491 1727203961.26951: checking for max_fail_percentage 7491 1727203961.26952: done checking for max_fail_percentage 7491 1727203961.26952: checking to see if all hosts have failed and the running result is not ok 7491 1727203961.26953: done checking to see if all hosts have failed 7491 1727203961.26955: getting the next task for host managed-node3 7491 1727203961.26957: done getting next task for host managed-node3 7491 1727203961.26958: ^ task is: None 7491 1727203961.26959: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
7491 1727203961.27005: in VariableManager get_vars()
7491 1727203961.27044: done with get_vars()
7491 1727203961.27051: in VariableManager get_vars()
7491 1727203961.27072: done with get_vars()
7491 1727203961.27076: variable 'omit' from source: magic vars
7491 1727203961.27107: in VariableManager get_vars()
7491 1727203961.27138: done with get_vars()
7491 1727203961.27160: variable 'omit' from source: magic vars

PLAY [Play for testing auto_gateway setting] ***********************************
7491 1727203961.27836: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
7491 1727203961.27894: getting the remaining hosts for this loop
7491 1727203961.27895: done getting the remaining hosts for this loop
7491 1727203961.27901: getting the next task for host managed-node3
7491 1727203961.27905: done getting next task for host managed-node3
7491 1727203961.27906: ^ task is: TASK: Gathering Facts
7491 1727203961.27907: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
7491 1727203961.27908: getting variables
7491 1727203961.27909: in VariableManager get_vars()
7491 1727203961.27922: Calling all_inventory to load vars for managed-node3
7491 1727203961.27923: Calling groups_inventory to load vars for managed-node3
7491 1727203961.27925: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203961.27929: Calling all_plugins_play to load vars for managed-node3
7491 1727203961.27946: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203961.27954: Calling groups_plugins_play to load vars for managed-node3
7491 1727203961.28073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203961.28186: done with get_vars()
7491 1727203961.28192: done getting variables
7491 1727203961.28221: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:3
Tuesday 24 September 2024 14:52:41 -0400 (0:00:00.046) 0:00:03.206 *****
7491 1727203961.28240: entering _queue_task() for managed-node3/gather_facts
7491 1727203961.28435: worker is 1 (out of 1 available)
7491 1727203961.28447: exiting _queue_task() for managed-node3/gather_facts
7491 1727203961.28458: done queuing things up, now waiting for results queue to drain
7491 1727203961.28459: waiting for pending results...
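The "Gathering Facts" task queued above at tests_auto_gateway.yml:3 is the fact-gathering step at the start of the play announced earlier ("Play for testing auto_gateway setting"). A hedged sketch of the play header that would produce it — the play name is confirmed by the log, while the `hosts` pattern and explicit `gather_facts` flag are assumptions:

```yaml
# Hypothetical sketch of the play header in tests_auto_gateway.yml.
# Only the play name is visible in the log; hosts/gather_facts values
# are assumptions for illustration.
- name: Play for testing auto_gateway setting
  hosts: all              # assumption; the log only shows managed-node3 being targeted
  gather_facts: true      # queues the "Gathering Facts" (setup) task seen here
```

Fact gathering is what later makes variables such as `ansible_distribution_major_version` available "from source: facts" for conditional evaluation.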
7491 1727203961.28608: running TaskExecutor() for managed-node3/TASK: Gathering Facts 7491 1727203961.28662: in run() - task 0affcd87-79f5-0a4a-ad01-0000000001b1 7491 1727203961.28685: variable 'ansible_search_path' from source: unknown 7491 1727203961.28711: calling self._execute() 7491 1727203961.28785: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203961.28789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203961.28799: variable 'omit' from source: magic vars 7491 1727203961.29076: variable 'ansible_distribution_major_version' from source: facts 7491 1727203961.29086: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203961.29095: variable 'omit' from source: magic vars 7491 1727203961.29125: variable 'omit' from source: magic vars 7491 1727203961.29147: variable 'omit' from source: magic vars 7491 1727203961.29179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203961.29210: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203961.29271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203961.29275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203961.29278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203961.29280: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203961.29282: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203961.29285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203961.29357: Set connection var ansible_timeout to 10 7491 1727203961.29361: Set connection var ansible_pipelining 
to False 7491 1727203961.29368: Set connection var ansible_shell_type to sh 7491 1727203961.29374: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203961.29381: Set connection var ansible_shell_executable to /bin/sh 7491 1727203961.29387: Set connection var ansible_connection to ssh 7491 1727203961.29403: variable 'ansible_shell_executable' from source: unknown 7491 1727203961.29406: variable 'ansible_connection' from source: unknown 7491 1727203961.29408: variable 'ansible_module_compression' from source: unknown 7491 1727203961.29411: variable 'ansible_shell_type' from source: unknown 7491 1727203961.29414: variable 'ansible_shell_executable' from source: unknown 7491 1727203961.29420: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203961.29424: variable 'ansible_pipelining' from source: unknown 7491 1727203961.29432: variable 'ansible_timeout' from source: unknown 7491 1727203961.29434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203961.29570: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203961.29579: variable 'omit' from source: magic vars 7491 1727203961.29582: starting attempt loop 7491 1727203961.29585: running the handler 7491 1727203961.29602: variable 'ansible_facts' from source: unknown 7491 1727203961.29619: _low_level_execute_command(): starting 7491 1727203961.29623: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203961.30408: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203961.30427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203961.30444: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203961.30466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203961.30508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203961.30529: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203961.30543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203961.30572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203961.30586: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203961.30597: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203961.30611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203961.30632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203961.30648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203961.30660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203961.30675: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203961.30689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203961.30773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203961.30797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203961.30814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203961.30900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203961.33126: stdout chunk (state=3): >>>/root <<< 7491 
1727203961.33283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203961.33393: stderr chunk (state=3): >>><<< 7491 1727203961.33406: stdout chunk (state=3): >>><<< 7491 1727203961.33548: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203961.33552: _low_level_execute_command(): starting 7491 1727203961.33555: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203961.3344126-7745-277190883353676 `" && echo ansible-tmp-1727203961.3344126-7745-277190883353676="` echo /root/.ansible/tmp/ansible-tmp-1727203961.3344126-7745-277190883353676 `" ) && sleep 0' 7491 1727203961.34213: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203961.34235: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203961.34250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203961.34272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203961.34351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203961.34366: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203961.34395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203961.34442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203961.34473: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203961.34484: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203961.34494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203961.34507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203961.34527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203961.34557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203961.34599: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203961.34603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203961.34683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203961.34686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203961.34736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203961.37334: 
stdout chunk (state=3): >>>ansible-tmp-1727203961.3344126-7745-277190883353676=/root/.ansible/tmp/ansible-tmp-1727203961.3344126-7745-277190883353676 <<< 7491 1727203961.37484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203961.37541: stderr chunk (state=3): >>><<< 7491 1727203961.37545: stdout chunk (state=3): >>><<< 7491 1727203961.37560: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203961.3344126-7745-277190883353676=/root/.ansible/tmp/ansible-tmp-1727203961.3344126-7745-277190883353676 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203961.37588: variable 'ansible_module_compression' from source: unknown 7491 1727203961.37636: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7491 1727203961.37957: variable 'ansible_facts' from source: unknown 
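The remote temp-directory creation above, and the AnsiballZ_setup.py transfer and chmod that follow, happen because this run has pipelining disabled (the earlier "Set connection var ansible_pipelining to False" lines). A sketch, under the assumption of a YAML inventory, of turning it on per host — `ansible_pipelining` is the real connection variable shown in this log, but where it is set here is not visible:

```yaml
# Sketch (assumption): enabling pipelining in a YAML inventory.
# When pipelining is in effect, Ansible can stream the AnsiballZ payload
# over the open SSH connection instead of the mkdir/sftp-put/chmod
# round-trips recorded in this log.
all:
  hosts:
    managed-node3:
      ansible_pipelining: true
```

Note that pipelining has caveats (for example, privilege escalation with `requiretty` configured in sudoers), which is why it defaults to off.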
7491 1727203961.37961: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203961.3344126-7745-277190883353676/AnsiballZ_setup.py 7491 1727203961.38206: Sending initial data 7491 1727203961.38210: Sent initial data (152 bytes) 7491 1727203961.39447: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203961.39462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203961.39481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203961.39499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203961.39544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203961.39557: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203961.39578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203961.39598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203961.39610: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203961.39624: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203961.39637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203961.39651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203961.39669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203961.39682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203961.39694: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203961.39709: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203961.39792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203961.39815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203961.39837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203961.39911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203961.42271: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203961.42319: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203961.42360: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpzkl4dmc4 /root/.ansible/tmp/ansible-tmp-1727203961.3344126-7745-277190883353676/AnsiballZ_setup.py <<< 7491 1727203961.42403: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203961.44695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203961.44902: stderr chunk (state=3): >>><<< 7491 1727203961.44906: stdout chunk (state=3): >>><<< 7491 1727203961.44970: done transferring module to remote 7491 1727203961.44973: _low_level_execute_command(): starting 7491 1727203961.44979: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727203961.3344126-7745-277190883353676/ /root/.ansible/tmp/ansible-tmp-1727203961.3344126-7745-277190883353676/AnsiballZ_setup.py && sleep 0' 7491 1727203961.46230: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203961.46388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203961.46406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203961.46426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203961.46471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203961.46484: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203961.46498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203961.46517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203961.46530: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203961.46542: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203961.46554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203961.46570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203961.46587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203961.46600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203961.46612: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203961.46626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203961.46744: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 7491 1727203961.47112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203961.47128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203961.47220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203961.49703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203961.49708: stdout chunk (state=3): >>><<< 7491 1727203961.49711: stderr chunk (state=3): >>><<< 7491 1727203961.49841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203961.49850: _low_level_execute_command(): starting 7491 1727203961.49853: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727203961.3344126-7745-277190883353676/AnsiballZ_setup.py && sleep 0' 7491 1727203961.50720: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203961.51288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203961.51306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203961.51326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203961.51372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203961.51385: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203961.51401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203961.51419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203961.51433: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203961.51445: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203961.51460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203961.51471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203961.51484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203961.51492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203961.51499: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203961.51509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203961.51583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203961.51604: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203961.51619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203961.51710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203962.19189: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", <<< 7491 1727203962.19235: stdout chunk (state=3): >>>"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_loadavg": {"1m": 0.46, "5m": 0.21, "15m": 0.09}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_is_chroot": false, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", <<< 7491 1727203962.19244: stdout chunk (state=3): >>>"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": 
"eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_t<<< 7491 1727203962.19284: stdout chunk (state=3): >>>hreads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2847, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 685, "free": 2847}, "nocache": {"free": 3288, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 307, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264287469568, "block_size": 4096, "block_total": 65519355, "block_available": 64523308, "block_used": 996047, "inode_total": 131071472, "inode_available": 130998346, "inode_used": 73126, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias 
--read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "42", "epoch": "1727203962", "epoch_int": "1727203962", "date": "2024-09-24", "time": "14:52:42", "iso8601_micro": "2024-09-24T18:52:42.187058Z", "iso8601": "2024-09-24T18:52:42Z", "iso8601_basic": "20240924T145242187058", "iso8601_basic_short": "20240924T145242", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7491 1727203962.21591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203962.21655: stderr chunk (state=3): >>><<< 7491 1727203962.21659: stdout chunk (state=3): >>><<< 7491 1727203962.21694: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_loadavg": {"1m": 0.46, "5m": 0.21, "15m": 0.09}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_is_chroot": false, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2847, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 685, "free": 2847}, "nocache": {"free": 3288, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 307, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264287469568, "block_size": 4096, "block_total": 65519355, "block_available": 64523308, "block_used": 996047, "inode_total": 131071472, "inode_available": 130998346, "inode_used": 73126, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "42", "epoch": 
"1727203962", "epoch_int": "1727203962", "date": "2024-09-24", "time": "14:52:42", "iso8601_micro": "2024-09-24T18:52:42.187058Z", "iso8601": "2024-09-24T18:52:42Z", "iso8601_basic": "20240924T145242187058", "iso8601_basic_short": "20240924T145242", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203962.21909: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203961.3344126-7745-277190883353676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203962.21928: _low_level_execute_command(): starting 7491 1727203962.21937: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203961.3344126-7745-277190883353676/ > /dev/null 2>&1 && sleep 0' 7491 1727203962.22408: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203962.22422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203962.22441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203962.22453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203962.22462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203962.22510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203962.22521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203962.22587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203962.25136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203962.25200: stderr chunk (state=3): >>><<< 7491 1727203962.25204: stdout chunk (state=3): >>><<< 7491 1727203962.25219: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203962.25228: handler run complete 7491 1727203962.25313: variable 'ansible_facts' from source: unknown 7491 1727203962.25391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.25582: variable 'ansible_facts' from source: unknown 7491 1727203962.25642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.25722: attempt loop complete, returning result 7491 1727203962.25725: _execute() done 7491 1727203962.25728: dumping result to json 7491 1727203962.25747: done dumping result, returning 7491 1727203962.25754: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0a4a-ad01-0000000001b1] 7491 1727203962.25760: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000001b1 7491 1727203962.26008: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000001b1 7491 1727203962.26011: WORKER PROCESS EXITING ok: [managed-node3] 7491 1727203962.26205: no more pending results, returning what we have 7491 1727203962.26208: results queue empty 7491 1727203962.26208: checking for any_errors_fatal 7491 1727203962.26209: done checking for any_errors_fatal 7491 1727203962.26210: checking for max_fail_percentage 7491 1727203962.26211: done checking for max_fail_percentage 7491 1727203962.26211: checking to see if all hosts have failed and the running result is not ok 7491 1727203962.26212: done checking to see if all hosts have failed 7491 1727203962.26212: getting the remaining hosts for this loop 7491 1727203962.26213: done getting the remaining hosts for this loop 7491 1727203962.26216: getting the next task for host managed-node3 7491 1727203962.26220: done getting next task for host managed-node3 7491 1727203962.26222: ^ task is: TASK: meta (flush_handlers) 7491 1727203962.26223: ^ state is: HOST 
STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203962.26225: getting variables 7491 1727203962.26226: in VariableManager get_vars() 7491 1727203962.26258: Calling all_inventory to load vars for managed-node3 7491 1727203962.26260: Calling groups_inventory to load vars for managed-node3 7491 1727203962.26262: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203962.26272: Calling all_plugins_play to load vars for managed-node3 7491 1727203962.26274: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203962.26276: Calling groups_plugins_play to load vars for managed-node3 7491 1727203962.26375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.26491: done with get_vars() 7491 1727203962.26499: done getting variables 7491 1727203962.26549: in VariableManager get_vars() 7491 1727203962.26562: Calling all_inventory to load vars for managed-node3 7491 1727203962.26566: Calling groups_inventory to load vars for managed-node3 7491 1727203962.26568: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203962.26572: Calling all_plugins_play to load vars for managed-node3 7491 1727203962.26573: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203962.26575: Calling groups_plugins_play to load vars for managed-node3 7491 1727203962.26667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.26778: done with get_vars() 7491 1727203962.26788: done queuing things up, now waiting for results queue to drain 7491 1727203962.26790: results queue empty 7491 1727203962.26790: checking for any_errors_fatal 7491 
1727203962.26792: done checking for any_errors_fatal 7491 1727203962.26793: checking for max_fail_percentage 7491 1727203962.26793: done checking for max_fail_percentage 7491 1727203962.26795: checking to see if all hosts have failed and the running result is not ok 7491 1727203962.26801: done checking to see if all hosts have failed 7491 1727203962.26801: getting the remaining hosts for this loop 7491 1727203962.26802: done getting the remaining hosts for this loop 7491 1727203962.26804: getting the next task for host managed-node3 7491 1727203962.26806: done getting next task for host managed-node3 7491 1727203962.26808: ^ task is: TASK: Include the task 'show_interfaces.yml' 7491 1727203962.26809: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203962.26810: getting variables 7491 1727203962.26811: in VariableManager get_vars() 7491 1727203962.26823: Calling all_inventory to load vars for managed-node3 7491 1727203962.26825: Calling groups_inventory to load vars for managed-node3 7491 1727203962.26826: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203962.26829: Calling all_plugins_play to load vars for managed-node3 7491 1727203962.26830: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203962.26832: Calling groups_plugins_play to load vars for managed-node3 7491 1727203962.26911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.27022: done with get_vars() 7491 1727203962.27028: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:9 Tuesday 24 September 2024 14:52:42 -0400 (0:00:00.988) 0:00:04.194 ***** 7491 1727203962.27078: entering _queue_task() for managed-node3/include_tasks 7491 1727203962.27286: worker is 1 (out of 1 available) 7491 1727203962.27298: exiting _queue_task() for managed-node3/include_tasks 7491 1727203962.27310: done queuing things up, now waiting for results queue to drain 7491 1727203962.27312: waiting for pending results... 7491 1727203962.27465: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 7491 1727203962.27536: in run() - task 0affcd87-79f5-0a4a-ad01-00000000000b 7491 1727203962.27547: variable 'ansible_search_path' from source: unknown 7491 1727203962.27579: calling self._execute() 7491 1727203962.27648: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203962.27653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203962.27660: variable 'omit' from source: magic vars 7491 1727203962.27949: variable 'ansible_distribution_major_version' from source: facts 7491 1727203962.27959: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203962.27967: _execute() done 7491 1727203962.27970: dumping result to json 7491 1727203962.27973: done dumping result, returning 7491 1727203962.27978: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-0a4a-ad01-00000000000b] 7491 1727203962.27986: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000000b 7491 1727203962.28076: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000000b 7491 1727203962.28079: WORKER PROCESS EXITING 7491 1727203962.28128: no more pending results, returning what we have 7491 1727203962.28133: in VariableManager get_vars() 7491 1727203962.28182: Calling all_inventory to load vars for 
managed-node3 7491 1727203962.28185: Calling groups_inventory to load vars for managed-node3 7491 1727203962.28187: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203962.28196: Calling all_plugins_play to load vars for managed-node3 7491 1727203962.28198: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203962.28200: Calling groups_plugins_play to load vars for managed-node3 7491 1727203962.28316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.28466: done with get_vars() 7491 1727203962.28472: variable 'ansible_search_path' from source: unknown 7491 1727203962.28481: we have included files to process 7491 1727203962.28482: generating all_blocks data 7491 1727203962.28483: done generating all_blocks data 7491 1727203962.28484: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203962.28484: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203962.28486: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203962.28592: in VariableManager get_vars() 7491 1727203962.28609: done with get_vars() 7491 1727203962.28686: done processing included file 7491 1727203962.28688: iterating over new_blocks loaded from include file 7491 1727203962.28689: in VariableManager get_vars() 7491 1727203962.28703: done with get_vars() 7491 1727203962.28704: filtering new block on tags 7491 1727203962.28714: done filtering new block on tags 7491 1727203962.28715: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 7491 1727203962.28719: 
extending task lists for all hosts with included blocks 7491 1727203962.30996: done extending task lists 7491 1727203962.30998: done processing included files 7491 1727203962.30999: results queue empty 7491 1727203962.30999: checking for any_errors_fatal 7491 1727203962.31000: done checking for any_errors_fatal 7491 1727203962.31001: checking for max_fail_percentage 7491 1727203962.31002: done checking for max_fail_percentage 7491 1727203962.31002: checking to see if all hosts have failed and the running result is not ok 7491 1727203962.31003: done checking to see if all hosts have failed 7491 1727203962.31003: getting the remaining hosts for this loop 7491 1727203962.31004: done getting the remaining hosts for this loop 7491 1727203962.31006: getting the next task for host managed-node3 7491 1727203962.31009: done getting next task for host managed-node3 7491 1727203962.31010: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7491 1727203962.31012: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203962.31013: getting variables 7491 1727203962.31014: in VariableManager get_vars() 7491 1727203962.31031: Calling all_inventory to load vars for managed-node3 7491 1727203962.31034: Calling groups_inventory to load vars for managed-node3 7491 1727203962.31036: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203962.31042: Calling all_plugins_play to load vars for managed-node3 7491 1727203962.31043: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203962.31046: Calling groups_plugins_play to load vars for managed-node3 7491 1727203962.31137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.31265: done with get_vars() 7491 1727203962.31274: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:52:42 -0400 (0:00:00.042) 0:00:04.237 ***** 7491 1727203962.31327: entering _queue_task() for managed-node3/include_tasks 7491 1727203962.31546: worker is 1 (out of 1 available) 7491 1727203962.31558: exiting _queue_task() for managed-node3/include_tasks 7491 1727203962.31573: done queuing things up, now waiting for results queue to drain 7491 1727203962.31574: waiting for pending results... 
7491 1727203962.31732: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 7491 1727203962.31790: in run() - task 0affcd87-79f5-0a4a-ad01-0000000001ca 7491 1727203962.31803: variable 'ansible_search_path' from source: unknown 7491 1727203962.31807: variable 'ansible_search_path' from source: unknown 7491 1727203962.31841: calling self._execute() 7491 1727203962.31909: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203962.31913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203962.31925: variable 'omit' from source: magic vars 7491 1727203962.32203: variable 'ansible_distribution_major_version' from source: facts 7491 1727203962.32213: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203962.32222: _execute() done 7491 1727203962.32225: dumping result to json 7491 1727203962.32227: done dumping result, returning 7491 1727203962.32234: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-0a4a-ad01-0000000001ca] 7491 1727203962.32243: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000001ca 7491 1727203962.32325: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000001ca 7491 1727203962.32328: WORKER PROCESS EXITING 7491 1727203962.32381: no more pending results, returning what we have 7491 1727203962.32385: in VariableManager get_vars() 7491 1727203962.32437: Calling all_inventory to load vars for managed-node3 7491 1727203962.32440: Calling groups_inventory to load vars for managed-node3 7491 1727203962.32442: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203962.32462: Calling all_plugins_play to load vars for managed-node3 7491 1727203962.32466: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203962.32470: Calling groups_plugins_play to load vars for managed-node3 7491 1727203962.32592: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.32709: done with get_vars() 7491 1727203962.32714: variable 'ansible_search_path' from source: unknown 7491 1727203962.32715: variable 'ansible_search_path' from source: unknown 7491 1727203962.32742: we have included files to process 7491 1727203962.32743: generating all_blocks data 7491 1727203962.32744: done generating all_blocks data 7491 1727203962.32745: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203962.32746: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203962.32747: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203962.32966: done processing included file 7491 1727203962.32967: iterating over new_blocks loaded from include file 7491 1727203962.32968: in VariableManager get_vars() 7491 1727203962.32984: done with get_vars() 7491 1727203962.32985: filtering new block on tags 7491 1727203962.32997: done filtering new block on tags 7491 1727203962.32998: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 7491 1727203962.33002: extending task lists for all hosts with included blocks 7491 1727203962.33061: done extending task lists 7491 1727203962.33062: done processing included files 7491 1727203962.33063: results queue empty 7491 1727203962.33063: checking for any_errors_fatal 7491 1727203962.33067: done checking for any_errors_fatal 7491 1727203962.33068: checking for max_fail_percentage 7491 1727203962.33068: done checking for max_fail_percentage 7491 
1727203962.33069: checking to see if all hosts have failed and the running result is not ok 7491 1727203962.33069: done checking to see if all hosts have failed 7491 1727203962.33070: getting the remaining hosts for this loop 7491 1727203962.33071: done getting the remaining hosts for this loop 7491 1727203962.33072: getting the next task for host managed-node3 7491 1727203962.33075: done getting next task for host managed-node3 7491 1727203962.33076: ^ task is: TASK: Gather current interface info 7491 1727203962.33078: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203962.33079: getting variables 7491 1727203962.33080: in VariableManager get_vars() 7491 1727203962.33091: Calling all_inventory to load vars for managed-node3 7491 1727203962.33093: Calling groups_inventory to load vars for managed-node3 7491 1727203962.33094: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203962.33098: Calling all_plugins_play to load vars for managed-node3 7491 1727203962.33099: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203962.33101: Calling groups_plugins_play to load vars for managed-node3 7491 1727203962.33205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.33319: done with get_vars() 7491 1727203962.33326: done getting variables 7491 1727203962.33354: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:52:42 -0400 (0:00:00.020) 0:00:04.257 ***** 7491 1727203962.33375: entering _queue_task() for managed-node3/command 7491 1727203962.33570: worker is 1 (out of 1 available) 7491 1727203962.33582: exiting _queue_task() for managed-node3/command 7491 1727203962.33595: done queuing things up, now waiting for results queue to drain 7491 1727203962.33597: waiting for pending results... 
7491 1727203962.33757: running TaskExecutor() for managed-node3/TASK: Gather current interface info 7491 1727203962.33825: in run() - task 0affcd87-79f5-0a4a-ad01-000000000389 7491 1727203962.33836: variable 'ansible_search_path' from source: unknown 7491 1727203962.33840: variable 'ansible_search_path' from source: unknown 7491 1727203962.33872: calling self._execute() 7491 1727203962.33938: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203962.33942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203962.33950: variable 'omit' from source: magic vars 7491 1727203962.34223: variable 'ansible_distribution_major_version' from source: facts 7491 1727203962.34234: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203962.34240: variable 'omit' from source: magic vars 7491 1727203962.34271: variable 'omit' from source: magic vars 7491 1727203962.34295: variable 'omit' from source: magic vars 7491 1727203962.34331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203962.34357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203962.34376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203962.34390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203962.34399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203962.34425: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203962.34432: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203962.34435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203962.34505: Set connection 
var ansible_timeout to 10 7491 1727203962.34511: Set connection var ansible_pipelining to False 7491 1727203962.34518: Set connection var ansible_shell_type to sh 7491 1727203962.34526: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203962.34538: Set connection var ansible_shell_executable to /bin/sh 7491 1727203962.34542: Set connection var ansible_connection to ssh 7491 1727203962.34559: variable 'ansible_shell_executable' from source: unknown 7491 1727203962.34562: variable 'ansible_connection' from source: unknown 7491 1727203962.34566: variable 'ansible_module_compression' from source: unknown 7491 1727203962.34569: variable 'ansible_shell_type' from source: unknown 7491 1727203962.34571: variable 'ansible_shell_executable' from source: unknown 7491 1727203962.34576: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203962.34578: variable 'ansible_pipelining' from source: unknown 7491 1727203962.34581: variable 'ansible_timeout' from source: unknown 7491 1727203962.34585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203962.34690: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203962.34701: variable 'omit' from source: magic vars 7491 1727203962.34706: starting attempt loop 7491 1727203962.34709: running the handler 7491 1727203962.34723: _low_level_execute_command(): starting 7491 1727203962.34732: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203962.35261: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203962.35285: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203962.35308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203962.35323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203962.35362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203962.35377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203962.35437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203962.37636: stdout chunk (state=3): >>>/root <<< 7491 1727203962.37786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203962.37838: stderr chunk (state=3): >>><<< 7491 1727203962.37842: stdout chunk (state=3): >>><<< 7491 1727203962.37863: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203962.37878: _low_level_execute_command(): starting 7491 1727203962.37885: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203962.378621-7799-210869019343536 `" && echo ansible-tmp-1727203962.378621-7799-210869019343536="` echo /root/.ansible/tmp/ansible-tmp-1727203962.378621-7799-210869019343536 `" ) && sleep 0' 7491 1727203962.38353: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203962.38357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203962.38390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203962.38393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 7491 1727203962.38404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203962.38460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203962.38465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203962.38470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203962.38514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203962.41049: stdout chunk (state=3): >>>ansible-tmp-1727203962.378621-7799-210869019343536=/root/.ansible/tmp/ansible-tmp-1727203962.378621-7799-210869019343536 <<< 7491 1727203962.41237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203962.41277: stderr chunk (state=3): >>><<< 7491 1727203962.41280: stdout chunk (state=3): >>><<< 7491 1727203962.41295: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203962.378621-7799-210869019343536=/root/.ansible/tmp/ansible-tmp-1727203962.378621-7799-210869019343536 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203962.41330: variable 'ansible_module_compression' from source: unknown 7491 1727203962.41971: ANSIBALLZ: Using generic lock for ansible.legacy.command 7491 1727203962.41975: ANSIBALLZ: Acquiring lock 7491 1727203962.41977: ANSIBALLZ: Lock acquired: 139674606106048 7491 1727203962.41979: ANSIBALLZ: Creating module 7491 1727203962.59348: ANSIBALLZ: Writing module into payload 7491 1727203962.59587: ANSIBALLZ: Writing module 7491 1727203962.59627: ANSIBALLZ: Renaming module 7491 1727203962.59661: ANSIBALLZ: Done creating module 7491 1727203962.59782: variable 'ansible_facts' from source: unknown 7491 1727203962.59861: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203962.378621-7799-210869019343536/AnsiballZ_command.py 7491 1727203962.60169: Sending initial data 7491 1727203962.60178: Sent initial data (153 bytes) 7491 1727203962.61115: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203962.61122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203962.61148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203962.61151: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203962.61153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203962.61213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203962.61234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203962.61252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203962.61326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 1 <<< 7491 1727203962.62999: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203962.63074: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203962.63110: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpu8smhged /root/.ansible/tmp/ansible-tmp-1727203962.378621-7799-210869019343536/AnsiballZ_command.py <<< 7491 1727203962.63126: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or 
directory <<< 7491 1727203962.64493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203962.64703: stderr chunk (state=3): >>><<< 7491 1727203962.64707: stdout chunk (state=3): >>><<< 7491 1727203962.64709: done transferring module to remote 7491 1727203962.64712: _low_level_execute_command(): starting 7491 1727203962.64714: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203962.378621-7799-210869019343536/ /root/.ansible/tmp/ansible-tmp-1727203962.378621-7799-210869019343536/AnsiballZ_command.py && sleep 0' 7491 1727203962.65349: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203962.65353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203962.65384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203962.65387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203962.65389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203962.65445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203962.65451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 7491 1727203962.65494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 7491 1727203962.67635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203962.67719: stderr chunk (state=3): >>><<< 7491 1727203962.67731: stdout chunk (state=3): >>><<< 7491 1727203962.67748: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 7491 1727203962.67751: _low_level_execute_command(): starting 7491 1727203962.67760: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203962.378621-7799-210869019343536/AnsiballZ_command.py && sleep 0' 7491 1727203962.68491: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203962.68505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 7491 1727203962.68529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203962.68548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203962.68594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203962.68611: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203962.68632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203962.68669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203962.68684: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203962.68696: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203962.68745: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203962.68777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203962.68813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203962.68845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203962.68856: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203962.68872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203962.69183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203962.69209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203962.89009: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], 
"start": "2024-09-24 14:52:42.884321", "end": "2024-09-24 14:52:42.889151", "delta": "0:00:00.004830", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203962.90645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203962.90703: stderr chunk (state=3): >>><<< 7491 1727203962.90707: stdout chunk (state=3): >>><<< 7491 1727203962.90724: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:52:42.884321", "end": "2024-09-24 14:52:42.889151", "delta": "0:00:00.004830", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203962.90752: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203962.378621-7799-210869019343536/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203962.90759: _low_level_execute_command(): starting 7491 1727203962.90766: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203962.378621-7799-210869019343536/ > /dev/null 2>&1 && sleep 0' 7491 1727203962.91239: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203962.91244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203962.91273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203962.91281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203962.91332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203962.91335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203962.91391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203962.93762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203962.93815: stderr chunk (state=3): >>><<< 7491 1727203962.93819: stdout chunk (state=3): >>><<< 7491 1727203962.93840: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203962.93843: handler run complete 7491 1727203962.93866: Evaluated conditional (False): False 7491 1727203962.93874: attempt loop complete, returning result 7491 1727203962.93877: _execute() done 7491 1727203962.93880: dumping result to json 7491 1727203962.93887: done dumping result, returning 7491 1727203962.93894: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [0affcd87-79f5-0a4a-ad01-000000000389] 7491 1727203962.93899: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000389 7491 1727203962.94001: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000389 7491 1727203962.94004: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004830", "end": "2024-09-24 14:52:42.889151", "rc": 0, "start": "2024-09-24 14:52:42.884321" } STDOUT: eth0 lo 7491 1727203962.94085: no more pending results, returning what we have 7491 1727203962.94088: results queue empty 7491 1727203962.94089: checking for any_errors_fatal 7491 1727203962.94090: done checking for any_errors_fatal 7491 1727203962.94091: checking for max_fail_percentage 7491 1727203962.94093: done checking for max_fail_percentage 7491 1727203962.94093: checking to see if all hosts have failed and the running result is not ok 7491 1727203962.94094: done checking to see if all hosts have failed 7491 1727203962.94095: getting the remaining hosts for this loop 7491 1727203962.94097: done getting the remaining hosts for this loop 7491 1727203962.94101: getting the next task for host managed-node3 7491 1727203962.94106: done getting next task for host managed-node3 7491 1727203962.94108: ^ 
task is: TASK: Set current_interfaces 7491 1727203962.94112: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203962.94115: getting variables 7491 1727203962.94117: in VariableManager get_vars() 7491 1727203962.94162: Calling all_inventory to load vars for managed-node3 7491 1727203962.94167: Calling groups_inventory to load vars for managed-node3 7491 1727203962.94170: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203962.94182: Calling all_plugins_play to load vars for managed-node3 7491 1727203962.94184: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203962.94187: Calling groups_plugins_play to load vars for managed-node3 7491 1727203962.94315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.94435: done with get_vars() 7491 1727203962.94443: done getting variables 7491 1727203962.94490: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:52:42 -0400 (0:00:00.611) 0:00:04.869 ***** 7491 1727203962.94511: entering _queue_task() for managed-node3/set_fact 7491 1727203962.94744: worker is 1 (out of 1 available) 7491 1727203962.94756: exiting _queue_task() for managed-node3/set_fact 7491 1727203962.94778: done queuing things up, now waiting for results queue to drain 7491 1727203962.94780: waiting for pending results... 7491 1727203962.95193: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 7491 1727203962.95203: in run() - task 0affcd87-79f5-0a4a-ad01-00000000038a 7491 1727203962.95206: variable 'ansible_search_path' from source: unknown 7491 1727203962.95209: variable 'ansible_search_path' from source: unknown 7491 1727203962.95211: calling self._execute() 7491 1727203962.95272: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203962.95282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203962.95295: variable 'omit' from source: magic vars 7491 1727203962.95810: variable 'ansible_distribution_major_version' from source: facts 7491 1727203962.95827: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203962.95837: variable 'omit' from source: magic vars 7491 1727203962.95872: variable 'omit' from source: magic vars 7491 1727203962.95974: variable '_current_interfaces' from source: set_fact 7491 1727203962.96048: variable 'omit' from source: magic vars 7491 1727203962.96092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203962.96145: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203962.96173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203962.96195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203962.96209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203962.96256: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203962.96267: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203962.96275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203962.96397: Set connection var ansible_timeout to 10 7491 1727203962.96411: Set connection var ansible_pipelining to False 7491 1727203962.96425: Set connection var ansible_shell_type to sh 7491 1727203962.96449: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203962.96468: Set connection var ansible_shell_executable to /bin/sh 7491 1727203962.96478: Set connection var ansible_connection to ssh 7491 1727203962.96511: variable 'ansible_shell_executable' from source: unknown 7491 1727203962.96521: variable 'ansible_connection' from source: unknown 7491 1727203962.96529: variable 'ansible_module_compression' from source: unknown 7491 1727203962.96534: variable 'ansible_shell_type' from source: unknown 7491 1727203962.96549: variable 'ansible_shell_executable' from source: unknown 7491 1727203962.96560: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203962.96571: variable 'ansible_pipelining' from source: unknown 7491 1727203962.96578: variable 'ansible_timeout' from source: unknown 7491 1727203962.96585: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727203962.96740: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203962.96757: variable 'omit' from source: magic vars 7491 1727203962.96775: starting attempt loop 7491 1727203962.96785: running the handler 7491 1727203962.96801: handler run complete 7491 1727203962.96815: attempt loop complete, returning result 7491 1727203962.96825: _execute() done 7491 1727203962.96831: dumping result to json 7491 1727203962.96838: done dumping result, returning 7491 1727203962.96848: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [0affcd87-79f5-0a4a-ad01-00000000038a] 7491 1727203962.96858: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000038a 7491 1727203962.96977: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000038a ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 7491 1727203962.97049: no more pending results, returning what we have 7491 1727203962.97052: results queue empty 7491 1727203962.97054: checking for any_errors_fatal 7491 1727203962.97060: done checking for any_errors_fatal 7491 1727203962.97061: checking for max_fail_percentage 7491 1727203962.97063: done checking for max_fail_percentage 7491 1727203962.97066: checking to see if all hosts have failed and the running result is not ok 7491 1727203962.97067: done checking to see if all hosts have failed 7491 1727203962.97068: getting the remaining hosts for this loop 7491 1727203962.97070: done getting the remaining hosts for this loop 7491 1727203962.97074: getting the next task for host managed-node3 7491 1727203962.97083: done getting next task for host managed-node3 7491 1727203962.97086: ^ task is: TASK: Show 
current_interfaces 7491 1727203962.97089: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203962.97093: getting variables 7491 1727203962.97094: in VariableManager get_vars() 7491 1727203962.97151: Calling all_inventory to load vars for managed-node3 7491 1727203962.97155: Calling groups_inventory to load vars for managed-node3 7491 1727203962.97158: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203962.97171: Calling all_plugins_play to load vars for managed-node3 7491 1727203962.97174: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203962.97176: Calling groups_plugins_play to load vars for managed-node3 7491 1727203962.97401: WORKER PROCESS EXITING 7491 1727203962.97425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.97615: done with get_vars() 7491 1727203962.97626: done getting variables 7491 1727203962.97700: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:52:42 -0400 (0:00:00.032) 0:00:04.901 ***** 7491 1727203962.97732: entering _queue_task() for managed-node3/debug 7491 1727203962.97733: Creating lock for debug 7491 1727203962.97933: worker is 1 (out of 1 available) 7491 1727203962.97947: exiting _queue_task() for managed-node3/debug 7491 1727203962.97959: done queuing things up, now waiting for results queue to drain 7491 1727203962.97961: waiting for pending results... 7491 1727203962.98114: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 7491 1727203962.98174: in run() - task 0affcd87-79f5-0a4a-ad01-0000000001cb 7491 1727203962.98186: variable 'ansible_search_path' from source: unknown 7491 1727203962.98190: variable 'ansible_search_path' from source: unknown 7491 1727203962.98220: calling self._execute() 7491 1727203962.98283: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203962.98289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203962.98297: variable 'omit' from source: magic vars 7491 1727203962.98569: variable 'ansible_distribution_major_version' from source: facts 7491 1727203962.98581: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203962.98586: variable 'omit' from source: magic vars 7491 1727203962.98612: variable 'omit' from source: magic vars 7491 1727203962.98685: variable 'current_interfaces' from source: set_fact 7491 1727203962.98704: variable 'omit' from source: magic vars 7491 1727203962.98736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203962.98761: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203962.98779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
7491 1727203962.98800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203962.98808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203962.98840: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203962.98849: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203962.98852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203962.98924: Set connection var ansible_timeout to 10 7491 1727203962.98930: Set connection var ansible_pipelining to False 7491 1727203962.98935: Set connection var ansible_shell_type to sh 7491 1727203962.98940: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203962.98947: Set connection var ansible_shell_executable to /bin/sh 7491 1727203962.98951: Set connection var ansible_connection to ssh 7491 1727203962.98971: variable 'ansible_shell_executable' from source: unknown 7491 1727203962.98975: variable 'ansible_connection' from source: unknown 7491 1727203962.98977: variable 'ansible_module_compression' from source: unknown 7491 1727203962.98979: variable 'ansible_shell_type' from source: unknown 7491 1727203962.98981: variable 'ansible_shell_executable' from source: unknown 7491 1727203962.98983: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203962.98986: variable 'ansible_pipelining' from source: unknown 7491 1727203962.98988: variable 'ansible_timeout' from source: unknown 7491 1727203962.98998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203962.99090: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203962.99098: variable 'omit' from source: magic vars 7491 1727203962.99103: starting attempt loop 7491 1727203962.99105: running the handler 7491 1727203962.99142: handler run complete 7491 1727203962.99152: attempt loop complete, returning result 7491 1727203962.99155: _execute() done 7491 1727203962.99157: dumping result to json 7491 1727203962.99160: done dumping result, returning 7491 1727203962.99167: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [0affcd87-79f5-0a4a-ad01-0000000001cb] 7491 1727203962.99173: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000001cb 7491 1727203962.99262: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000001cb 7491 1727203962.99267: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7491 1727203962.99412: no more pending results, returning what we have 7491 1727203962.99415: results queue empty 7491 1727203962.99416: checking for any_errors_fatal 7491 1727203962.99420: done checking for any_errors_fatal 7491 1727203962.99421: checking for max_fail_percentage 7491 1727203962.99422: done checking for max_fail_percentage 7491 1727203962.99423: checking to see if all hosts have failed and the running result is not ok 7491 1727203962.99424: done checking to see if all hosts have failed 7491 1727203962.99424: getting the remaining hosts for this loop 7491 1727203962.99426: done getting the remaining hosts for this loop 7491 1727203962.99429: getting the next task for host managed-node3 7491 1727203962.99435: done getting next task for host managed-node3 7491 1727203962.99438: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7491 1727203962.99439: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203962.99443: getting variables 7491 1727203962.99444: in VariableManager get_vars() 7491 1727203962.99488: Calling all_inventory to load vars for managed-node3 7491 1727203962.99491: Calling groups_inventory to load vars for managed-node3 7491 1727203962.99493: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203962.99502: Calling all_plugins_play to load vars for managed-node3 7491 1727203962.99505: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203962.99508: Calling groups_plugins_play to load vars for managed-node3 7491 1727203962.99674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203962.99870: done with get_vars() 7491 1727203962.99880: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:11 Tuesday 24 September 2024 14:52:42 -0400 (0:00:00.022) 0:00:04.923 ***** 7491 1727203962.99967: entering _queue_task() for managed-node3/include_tasks 7491 1727203963.00412: worker is 1 (out of 1 available) 7491 1727203963.00428: exiting _queue_task() for managed-node3/include_tasks 7491 1727203963.00440: done queuing things up, now waiting for results queue to drain 7491 1727203963.00442: waiting for pending results... 
7491 1727203963.00931: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 7491 1727203963.01028: in run() - task 0affcd87-79f5-0a4a-ad01-00000000000c 7491 1727203963.01047: variable 'ansible_search_path' from source: unknown 7491 1727203963.01092: calling self._execute() 7491 1727203963.01285: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.01299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.01314: variable 'omit' from source: magic vars 7491 1727203963.01698: variable 'ansible_distribution_major_version' from source: facts 7491 1727203963.01719: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203963.01731: _execute() done 7491 1727203963.01738: dumping result to json 7491 1727203963.01746: done dumping result, returning 7491 1727203963.01760: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-0a4a-ad01-00000000000c] 7491 1727203963.01789: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000000c 7491 1727203963.01952: no more pending results, returning what we have 7491 1727203963.01958: in VariableManager get_vars() 7491 1727203963.02082: Calling all_inventory to load vars for managed-node3 7491 1727203963.02085: Calling groups_inventory to load vars for managed-node3 7491 1727203963.02089: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203963.02105: Calling all_plugins_play to load vars for managed-node3 7491 1727203963.02108: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203963.02111: Calling groups_plugins_play to load vars for managed-node3 7491 1727203963.02246: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000000c 7491 1727203963.02249: WORKER PROCESS EXITING 7491 1727203963.02261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7491 1727203963.02378: done with get_vars() 7491 1727203963.02383: variable 'ansible_search_path' from source: unknown 7491 1727203963.02393: we have included files to process 7491 1727203963.02393: generating all_blocks data 7491 1727203963.02395: done generating all_blocks data 7491 1727203963.02398: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727203963.02399: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727203963.02400: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727203963.02775: in VariableManager get_vars() 7491 1727203963.02793: done with get_vars() 7491 1727203963.02943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 7491 1727203963.03322: done processing included file 7491 1727203963.03323: iterating over new_blocks loaded from include file 7491 1727203963.03324: in VariableManager get_vars() 7491 1727203963.03339: done with get_vars() 7491 1727203963.03340: filtering new block on tags 7491 1727203963.03362: done filtering new block on tags 7491 1727203963.03364: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 7491 1727203963.03369: extending task lists for all hosts with included blocks 7491 1727203963.07254: done extending task lists 7491 1727203963.07256: done processing included files 7491 1727203963.07257: results queue empty 7491 1727203963.07258: checking for any_errors_fatal 7491 1727203963.07269: done checking for any_errors_fatal 7491 1727203963.07275: checking for max_fail_percentage 7491 1727203963.07276: done checking for 
max_fail_percentage 7491 1727203963.07277: checking to see if all hosts have failed and the running result is not ok 7491 1727203963.07278: done checking to see if all hosts have failed 7491 1727203963.07279: getting the remaining hosts for this loop 7491 1727203963.07280: done getting the remaining hosts for this loop 7491 1727203963.07283: getting the next task for host managed-node3 7491 1727203963.07287: done getting next task for host managed-node3 7491 1727203963.07289: ^ task is: TASK: Ensure state in ["present", "absent"] 7491 1727203963.07292: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203963.07294: getting variables 7491 1727203963.07300: in VariableManager get_vars() 7491 1727203963.07332: Calling all_inventory to load vars for managed-node3 7491 1727203963.07335: Calling groups_inventory to load vars for managed-node3 7491 1727203963.07337: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203963.07342: Calling all_plugins_play to load vars for managed-node3 7491 1727203963.07345: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203963.07347: Calling groups_plugins_play to load vars for managed-node3 7491 1727203963.07628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203963.07836: done with get_vars() 7491 1727203963.07854: done getting variables 7491 1727203963.07945: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:52:43 -0400 (0:00:00.080) 0:00:05.003 ***** 7491 1727203963.07975: entering _queue_task() for managed-node3/fail 7491 1727203963.07976: Creating lock for fail 7491 1727203963.08638: worker is 1 (out of 1 available) 7491 1727203963.08650: exiting _queue_task() for managed-node3/fail 7491 1727203963.08662: done queuing things up, now waiting for results queue to drain 7491 1727203963.08666: waiting for pending results... 
7491 1727203963.08939: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 7491 1727203963.09062: in run() - task 0affcd87-79f5-0a4a-ad01-0000000003a5 7491 1727203963.09085: variable 'ansible_search_path' from source: unknown 7491 1727203963.09093: variable 'ansible_search_path' from source: unknown 7491 1727203963.09144: calling self._execute() 7491 1727203963.09242: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.09254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.09271: variable 'omit' from source: magic vars 7491 1727203963.09712: variable 'ansible_distribution_major_version' from source: facts 7491 1727203963.09735: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203963.09899: variable 'state' from source: include params 7491 1727203963.09910: Evaluated conditional (state not in ["present", "absent"]): False 7491 1727203963.09921: when evaluation is False, skipping this task 7491 1727203963.09929: _execute() done 7491 1727203963.09943: dumping result to json 7491 1727203963.09952: done dumping result, returning 7491 1727203963.09969: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-0a4a-ad01-0000000003a5] 7491 1727203963.09992: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003a5 skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7491 1727203963.10148: no more pending results, returning what we have 7491 1727203963.10151: results queue empty 7491 1727203963.10153: checking for any_errors_fatal 7491 1727203963.10154: done checking for any_errors_fatal 7491 1727203963.10155: checking for max_fail_percentage 7491 1727203963.10157: done checking for max_fail_percentage 7491 1727203963.10158: checking to see if all hosts have failed and the 
running result is not ok 7491 1727203963.10159: done checking to see if all hosts have failed 7491 1727203963.10160: getting the remaining hosts for this loop 7491 1727203963.10162: done getting the remaining hosts for this loop 7491 1727203963.10168: getting the next task for host managed-node3 7491 1727203963.10174: done getting next task for host managed-node3 7491 1727203963.10176: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7491 1727203963.10179: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203963.10183: getting variables 7491 1727203963.10185: in VariableManager get_vars() 7491 1727203963.10244: Calling all_inventory to load vars for managed-node3 7491 1727203963.10247: Calling groups_inventory to load vars for managed-node3 7491 1727203963.10249: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203963.10265: Calling all_plugins_play to load vars for managed-node3 7491 1727203963.10268: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203963.10271: Calling groups_plugins_play to load vars for managed-node3 7491 1727203963.10486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203963.10762: done with get_vars() 7491 1727203963.10774: done getting variables 7491 1727203963.10848: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:52:43 -0400 (0:00:00.029) 0:00:05.032 ***** 7491 1727203963.10884: entering _queue_task() for managed-node3/fail 7491 1727203963.10902: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003a5 7491 1727203963.10911: WORKER PROCESS EXITING 7491 1727203963.11363: worker is 1 (out of 1 available) 7491 1727203963.11376: exiting _queue_task() for managed-node3/fail 7491 1727203963.11389: done queuing things up, now waiting for results queue to drain 7491 1727203963.11390: waiting for pending results... 
7491 1727203963.12312: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7491 1727203963.12448: in run() - task 0affcd87-79f5-0a4a-ad01-0000000003a6 7491 1727203963.12477: variable 'ansible_search_path' from source: unknown 7491 1727203963.12487: variable 'ansible_search_path' from source: unknown 7491 1727203963.12532: calling self._execute() 7491 1727203963.12630: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.12642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.12655: variable 'omit' from source: magic vars 7491 1727203963.13063: variable 'ansible_distribution_major_version' from source: facts 7491 1727203963.13084: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203963.13238: variable 'type' from source: play vars 7491 1727203963.13256: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7491 1727203963.13263: when evaluation is False, skipping this task 7491 1727203963.13271: _execute() done 7491 1727203963.13277: dumping result to json 7491 1727203963.13284: done dumping result, returning 7491 1727203963.13292: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-0a4a-ad01-0000000003a6] 7491 1727203963.13301: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003a6 skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7491 1727203963.13456: no more pending results, returning what we have 7491 1727203963.13460: results queue empty 7491 1727203963.13461: checking for any_errors_fatal 7491 1727203963.13471: done checking for any_errors_fatal 7491 1727203963.13472: checking for max_fail_percentage 7491 1727203963.13474: done checking for max_fail_percentage 7491 1727203963.13475: checking to see if all hosts have failed and the 
running result is not ok 7491 1727203963.13477: done checking to see if all hosts have failed 7491 1727203963.13478: getting the remaining hosts for this loop 7491 1727203963.13480: done getting the remaining hosts for this loop 7491 1727203963.13484: getting the next task for host managed-node3 7491 1727203963.13491: done getting next task for host managed-node3 7491 1727203963.13494: ^ task is: TASK: Include the task 'show_interfaces.yml' 7491 1727203963.13498: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203963.13503: getting variables 7491 1727203963.13505: in VariableManager get_vars() 7491 1727203963.13570: Calling all_inventory to load vars for managed-node3 7491 1727203963.13573: Calling groups_inventory to load vars for managed-node3 7491 1727203963.13576: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203963.13592: Calling all_plugins_play to load vars for managed-node3 7491 1727203963.13595: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203963.13599: Calling groups_plugins_play to load vars for managed-node3 7491 1727203963.13796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203963.13994: done with get_vars() 7491 1727203963.14036: done getting variables 7491 1727203963.14573: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003a6 7491 1727203963.14577: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:52:43 -0400 (0:00:00.037) 0:00:05.070 ***** 7491 1727203963.14640: entering _queue_task() for managed-node3/include_tasks 7491 1727203963.14897: worker is 1 (out of 1 available) 7491 1727203963.14909: exiting _queue_task() for managed-node3/include_tasks 7491 1727203963.14926: done queuing things up, now waiting for results queue to drain 7491 1727203963.14928: waiting for pending results... 
7491 1727203963.15193: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 7491 1727203963.15303: in run() - task 0affcd87-79f5-0a4a-ad01-0000000003a7 7491 1727203963.15326: variable 'ansible_search_path' from source: unknown 7491 1727203963.15334: variable 'ansible_search_path' from source: unknown 7491 1727203963.15380: calling self._execute() 7491 1727203963.15469: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.15484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.15497: variable 'omit' from source: magic vars 7491 1727203963.15966: variable 'ansible_distribution_major_version' from source: facts 7491 1727203963.15984: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203963.15994: _execute() done 7491 1727203963.16003: dumping result to json 7491 1727203963.16010: done dumping result, returning 7491 1727203963.16026: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-0a4a-ad01-0000000003a7] 7491 1727203963.16038: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003a7 7491 1727203963.16161: no more pending results, returning what we have 7491 1727203963.16169: in VariableManager get_vars() 7491 1727203963.16233: Calling all_inventory to load vars for managed-node3 7491 1727203963.16237: Calling groups_inventory to load vars for managed-node3 7491 1727203963.16240: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203963.16255: Calling all_plugins_play to load vars for managed-node3 7491 1727203963.16258: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203963.16262: Calling groups_plugins_play to load vars for managed-node3 7491 1727203963.16547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203963.16778: done with get_vars() 7491 
1727203963.16786: variable 'ansible_search_path' from source: unknown 7491 1727203963.16787: variable 'ansible_search_path' from source: unknown 7491 1727203963.16838: we have included files to process 7491 1727203963.16839: generating all_blocks data 7491 1727203963.16842: done generating all_blocks data 7491 1727203963.16848: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203963.16849: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203963.16851: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203963.16965: in VariableManager get_vars() 7491 1727203963.16995: done with get_vars() 7491 1727203963.17123: done processing included file 7491 1727203963.17125: iterating over new_blocks loaded from include file 7491 1727203963.17127: in VariableManager get_vars() 7491 1727203963.17151: done with get_vars() 7491 1727203963.17153: filtering new block on tags 7491 1727203963.17175: done filtering new block on tags 7491 1727203963.17177: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 7491 1727203963.17184: extending task lists for all hosts with included blocks 7491 1727203963.17788: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003a7 7491 1727203963.17792: WORKER PROCESS EXITING 7491 1727203963.17869: done extending task lists 7491 1727203963.17871: done processing included files 7491 1727203963.17871: results queue empty 7491 1727203963.17872: checking for any_errors_fatal 7491 1727203963.17876: done checking for any_errors_fatal 7491 1727203963.17877: checking for max_fail_percentage 7491 
1727203963.17878: done checking for max_fail_percentage 7491 1727203963.17879: checking to see if all hosts have failed and the running result is not ok 7491 1727203963.17880: done checking to see if all hosts have failed 7491 1727203963.17881: getting the remaining hosts for this loop 7491 1727203963.17882: done getting the remaining hosts for this loop 7491 1727203963.17885: getting the next task for host managed-node3 7491 1727203963.17890: done getting next task for host managed-node3 7491 1727203963.17892: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7491 1727203963.17896: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203963.17899: getting variables 7491 1727203963.17900: in VariableManager get_vars() 7491 1727203963.17918: Calling all_inventory to load vars for managed-node3 7491 1727203963.17921: Calling groups_inventory to load vars for managed-node3 7491 1727203963.17923: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203963.17928: Calling all_plugins_play to load vars for managed-node3 7491 1727203963.17930: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203963.17933: Calling groups_plugins_play to load vars for managed-node3 7491 1727203963.18073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203963.18286: done with get_vars() 7491 1727203963.18296: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:52:43 -0400 (0:00:00.037) 0:00:05.107 ***** 7491 1727203963.18369: entering _queue_task() for managed-node3/include_tasks 7491 1727203963.18615: worker is 1 (out of 1 available) 7491 1727203963.18628: exiting _queue_task() for managed-node3/include_tasks 7491 1727203963.18640: done queuing things up, now waiting for results queue to drain 7491 1727203963.18642: waiting for pending results... 
7491 1727203963.18917: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 7491 1727203963.19046: in run() - task 0affcd87-79f5-0a4a-ad01-00000000057e 7491 1727203963.19074: variable 'ansible_search_path' from source: unknown 7491 1727203963.19086: variable 'ansible_search_path' from source: unknown 7491 1727203963.19132: calling self._execute() 7491 1727203963.19249: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.19260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.19277: variable 'omit' from source: magic vars 7491 1727203963.19656: variable 'ansible_distribution_major_version' from source: facts 7491 1727203963.19748: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203963.19759: _execute() done 7491 1727203963.19769: dumping result to json 7491 1727203963.19848: done dumping result, returning 7491 1727203963.19858: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-0a4a-ad01-00000000057e] 7491 1727203963.19873: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000057e 7491 1727203963.20000: no more pending results, returning what we have 7491 1727203963.20010: in VariableManager get_vars() 7491 1727203963.20075: Calling all_inventory to load vars for managed-node3 7491 1727203963.20079: Calling groups_inventory to load vars for managed-node3 7491 1727203963.20081: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203963.20095: Calling all_plugins_play to load vars for managed-node3 7491 1727203963.20098: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203963.20100: Calling groups_plugins_play to load vars for managed-node3 7491 1727203963.20292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203963.20488: done with get_vars() 
7491 1727203963.20496: variable 'ansible_search_path' from source: unknown 7491 1727203963.20497: variable 'ansible_search_path' from source: unknown 7491 1727203963.20559: we have included files to process 7491 1727203963.20561: generating all_blocks data 7491 1727203963.20563: done generating all_blocks data 7491 1727203963.20567: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203963.20568: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203963.20571: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203963.21186: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000057e 7491 1727203963.21189: WORKER PROCESS EXITING 7491 1727203963.21373: done processing included file 7491 1727203963.21375: iterating over new_blocks loaded from include file 7491 1727203963.21377: in VariableManager get_vars() 7491 1727203963.21405: done with get_vars() 7491 1727203963.21408: filtering new block on tags 7491 1727203963.21426: done filtering new block on tags 7491 1727203963.21428: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 7491 1727203963.21433: extending task lists for all hosts with included blocks 7491 1727203963.21583: done extending task lists 7491 1727203963.21584: done processing included files 7491 1727203963.21585: results queue empty 7491 1727203963.21586: checking for any_errors_fatal 7491 1727203963.21589: done checking for any_errors_fatal 7491 1727203963.21590: checking for max_fail_percentage 7491 1727203963.21591: done checking for max_fail_percentage 7491 1727203963.21591: 
checking to see if all hosts have failed and the running result is not ok 7491 1727203963.21592: done checking to see if all hosts have failed 7491 1727203963.21593: getting the remaining hosts for this loop 7491 1727203963.21595: done getting the remaining hosts for this loop 7491 1727203963.21597: getting the next task for host managed-node3 7491 1727203963.21602: done getting next task for host managed-node3 7491 1727203963.21603: ^ task is: TASK: Gather current interface info 7491 1727203963.21607: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203963.21609: getting variables 7491 1727203963.21610: in VariableManager get_vars() 7491 1727203963.21627: Calling all_inventory to load vars for managed-node3 7491 1727203963.21630: Calling groups_inventory to load vars for managed-node3 7491 1727203963.21632: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203963.21637: Calling all_plugins_play to load vars for managed-node3 7491 1727203963.21639: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203963.21642: Calling groups_plugins_play to load vars for managed-node3 7491 1727203963.21882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203963.22061: done with get_vars() 7491 1727203963.22072: done getting variables 7491 1727203963.22107: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:52:43 -0400 (0:00:00.037) 0:00:05.145 ***** 7491 1727203963.22133: entering _queue_task() for managed-node3/command 7491 1727203963.22384: worker is 1 (out of 1 available) 7491 1727203963.22398: exiting _queue_task() for managed-node3/command 7491 1727203963.22411: done queuing things up, now waiting for results queue to drain 7491 1727203963.22413: waiting for pending results... 
7491 1727203963.22661: running TaskExecutor() for managed-node3/TASK: Gather current interface info 7491 1727203963.22785: in run() - task 0affcd87-79f5-0a4a-ad01-0000000005b5 7491 1727203963.22805: variable 'ansible_search_path' from source: unknown 7491 1727203963.22812: variable 'ansible_search_path' from source: unknown 7491 1727203963.22851: calling self._execute() 7491 1727203963.22940: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.22950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.22972: variable 'omit' from source: magic vars 7491 1727203963.23325: variable 'ansible_distribution_major_version' from source: facts 7491 1727203963.23343: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203963.23354: variable 'omit' from source: magic vars 7491 1727203963.23410: variable 'omit' from source: magic vars 7491 1727203963.23448: variable 'omit' from source: magic vars 7491 1727203963.23493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203963.23535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203963.23561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203963.23587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203963.23605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203963.23641: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203963.23648: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.23654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.23755: Set connection 
var ansible_timeout to 10 7491 1727203963.23768: Set connection var ansible_pipelining to False 7491 1727203963.23777: Set connection var ansible_shell_type to sh 7491 1727203963.23785: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203963.23794: Set connection var ansible_shell_executable to /bin/sh 7491 1727203963.23801: Set connection var ansible_connection to ssh 7491 1727203963.23824: variable 'ansible_shell_executable' from source: unknown 7491 1727203963.23834: variable 'ansible_connection' from source: unknown 7491 1727203963.23841: variable 'ansible_module_compression' from source: unknown 7491 1727203963.23845: variable 'ansible_shell_type' from source: unknown 7491 1727203963.23851: variable 'ansible_shell_executable' from source: unknown 7491 1727203963.23856: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.23862: variable 'ansible_pipelining' from source: unknown 7491 1727203963.23869: variable 'ansible_timeout' from source: unknown 7491 1727203963.23876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.24115: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203963.24174: variable 'omit' from source: magic vars 7491 1727203963.24185: starting attempt loop 7491 1727203963.24191: running the handler 7491 1727203963.24210: _low_level_execute_command(): starting 7491 1727203963.24281: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203963.25829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203963.25848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203963.25867: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.25886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.25932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203963.25945: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203963.25957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.25977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203963.25986: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203963.25996: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203963.26007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203963.26020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.26037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.26053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203963.26067: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203963.26083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.26157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203963.26237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203963.26254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203963.26336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203963.27953: stdout chunk (state=3): >>>/root <<< 7491 
1727203963.28160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203963.28163: stdout chunk (state=3): >>><<< 7491 1727203963.28168: stderr chunk (state=3): >>><<< 7491 1727203963.28271: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203963.28278: _low_level_execute_command(): starting 7491 1727203963.28281: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203963.2819371-7867-29351438978507 `" && echo ansible-tmp-1727203963.2819371-7867-29351438978507="` echo /root/.ansible/tmp/ansible-tmp-1727203963.2819371-7867-29351438978507 `" ) && sleep 0' 7491 1727203963.30288: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 
7491 1727203963.30293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.30320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.30324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.30326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.30408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203963.30411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203963.30416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203963.30473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203963.32291: stdout chunk (state=3): >>>ansible-tmp-1727203963.2819371-7867-29351438978507=/root/.ansible/tmp/ansible-tmp-1727203963.2819371-7867-29351438978507 <<< 7491 1727203963.32461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203963.32468: stdout chunk (state=3): >>><<< 7491 1727203963.32475: stderr chunk (state=3): >>><<< 7491 1727203963.32499: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203963.2819371-7867-29351438978507=/root/.ansible/tmp/ansible-tmp-1727203963.2819371-7867-29351438978507 
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203963.32535: variable 'ansible_module_compression' from source: unknown 7491 1727203963.32593: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203963.32630: variable 'ansible_facts' from source: unknown 7491 1727203963.32720: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203963.2819371-7867-29351438978507/AnsiballZ_command.py 7491 1727203963.33115: Sending initial data 7491 1727203963.33150: Sent initial data (153 bytes) 7491 1727203963.34225: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203963.34246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203963.34263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
7491 1727203963.34287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.34330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203963.34343: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203963.34363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.34384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203963.34398: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203963.34409: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203963.34421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203963.34436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.34451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.34469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203963.34484: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203963.34498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.34576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203963.34602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203963.34619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203963.34697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203963.36410: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203963.36447: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203963.36484: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpo4ri9k05 /root/.ansible/tmp/ansible-tmp-1727203963.2819371-7867-29351438978507/AnsiballZ_command.py <<< 7491 1727203963.36521: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203963.37949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203963.38040: stderr chunk (state=3): >>><<< 7491 1727203963.38046: stdout chunk (state=3): >>><<< 7491 1727203963.38075: done transferring module to remote 7491 1727203963.38090: _low_level_execute_command(): starting 7491 1727203963.38095: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203963.2819371-7867-29351438978507/ /root/.ansible/tmp/ansible-tmp-1727203963.2819371-7867-29351438978507/AnsiballZ_command.py && sleep 0' 7491 1727203963.38850: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203963.38853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.38876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.38915: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.38918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.38921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.38973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203963.38981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203963.38988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203963.39042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203963.40796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203963.40981: stderr chunk (state=3): >>><<< 7491 1727203963.40985: stdout chunk (state=3): >>><<< 7491 1727203963.41006: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203963.41009: _low_level_execute_command(): starting 7491 1727203963.41015: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203963.2819371-7867-29351438978507/AnsiballZ_command.py && sleep 0' 7491 1727203963.41983: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203963.41999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203963.42015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.42034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.42079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203963.42093: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203963.42108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.42127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203963.42141: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203963.42154: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203963.42171: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203963.42186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.42202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.42214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203963.42224: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203963.42236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.42312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203963.42334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203963.42350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203963.42428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203963.55694: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:52:43.552960", "end": "2024-09-24 14:52:43.556261", "delta": "0:00:00.003301", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203963.56953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203963.56958: stderr chunk (state=3): >>><<< 7491 1727203963.56960: stdout chunk (state=3): >>><<< 7491 1727203963.56986: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:52:43.552960", "end": "2024-09-24 14:52:43.556261", "delta": "0:00:00.003301", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
7491 1727203963.57032: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203963.2819371-7867-29351438978507/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203963.57040: _low_level_execute_command(): starting 7491 1727203963.57046: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203963.2819371-7867-29351438978507/ > /dev/null 2>&1 && sleep 0' 7491 1727203963.57779: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.57817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.57821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.57823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.57885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203963.57890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203963.57893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203963.57930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203963.60176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203963.60180: stderr chunk (state=3): >>><<< 7491 1727203963.60182: stdout chunk (state=3): >>><<< 7491 1727203963.60186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
7491 1727203963.60188: handler run complete 7491 1727203963.60190: Evaluated conditional (False): False 7491 1727203963.60192: attempt loop complete, returning result 7491 1727203963.60194: _execute() done 7491 1727203963.60196: dumping result to json 7491 1727203963.60197: done dumping result, returning 7491 1727203963.60199: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [0affcd87-79f5-0a4a-ad01-0000000005b5] 7491 1727203963.60201: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000005b5 7491 1727203963.60282: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000005b5 7491 1727203963.60286: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003301", "end": "2024-09-24 14:52:43.556261", "rc": 0, "start": "2024-09-24 14:52:43.552960" } STDOUT: eth0 lo 7491 1727203963.60372: no more pending results, returning what we have 7491 1727203963.60375: results queue empty 7491 1727203963.60376: checking for any_errors_fatal 7491 1727203963.60377: done checking for any_errors_fatal 7491 1727203963.60377: checking for max_fail_percentage 7491 1727203963.60379: done checking for max_fail_percentage 7491 1727203963.60379: checking to see if all hosts have failed and the running result is not ok 7491 1727203963.60380: done checking to see if all hosts have failed 7491 1727203963.60380: getting the remaining hosts for this loop 7491 1727203963.60381: done getting the remaining hosts for this loop 7491 1727203963.60384: getting the next task for host managed-node3 7491 1727203963.60390: done getting next task for host managed-node3 7491 1727203963.60392: ^ task is: TASK: Set current_interfaces 7491 1727203963.60397: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203963.60404: getting variables 7491 1727203963.60406: in VariableManager get_vars() 7491 1727203963.60456: Calling all_inventory to load vars for managed-node3 7491 1727203963.60460: Calling groups_inventory to load vars for managed-node3 7491 1727203963.60462: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203963.60473: Calling all_plugins_play to load vars for managed-node3 7491 1727203963.60476: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203963.60479: Calling groups_plugins_play to load vars for managed-node3 7491 1727203963.60647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203963.60848: done with get_vars() 7491 1727203963.60861: done getting variables 7491 1727203963.60925: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:52:43 -0400 (0:00:00.388) 0:00:05.533 ***** 7491 1727203963.60957: entering _queue_task() for managed-node3/set_fact 7491 1727203963.61214: worker is 1 (out of 1 available) 7491 1727203963.61228: exiting _queue_task() for managed-node3/set_fact 7491 1727203963.61240: done queuing things up, now waiting for results queue to drain 7491 1727203963.61242: waiting for pending results... 7491 1727203963.61607: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 7491 1727203963.61722: in run() - task 0affcd87-79f5-0a4a-ad01-0000000005b6 7491 1727203963.61732: variable 'ansible_search_path' from source: unknown 7491 1727203963.61736: variable 'ansible_search_path' from source: unknown 7491 1727203963.61768: calling self._execute() 7491 1727203963.61836: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.61840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.61847: variable 'omit' from source: magic vars 7491 1727203963.62173: variable 'ansible_distribution_major_version' from source: facts 7491 1727203963.62183: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203963.62189: variable 'omit' from source: magic vars 7491 1727203963.62220: variable 'omit' from source: magic vars 7491 1727203963.62300: variable '_current_interfaces' from source: set_fact 7491 1727203963.62347: variable 'omit' from source: magic vars 7491 1727203963.62379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203963.62406: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203963.62423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203963.62438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203963.62446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203963.62473: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203963.62476: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.62479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.62547: Set connection var ansible_timeout to 10 7491 1727203963.62555: Set connection var ansible_pipelining to False 7491 1727203963.62563: Set connection var ansible_shell_type to sh 7491 1727203963.62570: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203963.62577: Set connection var ansible_shell_executable to /bin/sh 7491 1727203963.62581: Set connection var ansible_connection to ssh 7491 1727203963.62599: variable 'ansible_shell_executable' from source: unknown 7491 1727203963.62602: variable 'ansible_connection' from source: unknown 7491 1727203963.62604: variable 'ansible_module_compression' from source: unknown 7491 1727203963.62607: variable 'ansible_shell_type' from source: unknown 7491 1727203963.62609: variable 'ansible_shell_executable' from source: unknown 7491 1727203963.62611: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.62615: variable 'ansible_pipelining' from source: unknown 7491 1727203963.62617: variable 'ansible_timeout' from source: unknown 7491 1727203963.62624: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727203963.62726: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203963.62734: variable 'omit' from source: magic vars 7491 1727203963.62739: starting attempt loop 7491 1727203963.62742: running the handler 7491 1727203963.62751: handler run complete 7491 1727203963.62759: attempt loop complete, returning result 7491 1727203963.62762: _execute() done 7491 1727203963.62765: dumping result to json 7491 1727203963.62772: done dumping result, returning 7491 1727203963.62782: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [0affcd87-79f5-0a4a-ad01-0000000005b6] 7491 1727203963.62786: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000005b6 7491 1727203963.62860: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000005b6 7491 1727203963.62863: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 7491 1727203963.62932: no more pending results, returning what we have 7491 1727203963.62935: results queue empty 7491 1727203963.62936: checking for any_errors_fatal 7491 1727203963.62943: done checking for any_errors_fatal 7491 1727203963.62944: checking for max_fail_percentage 7491 1727203963.62945: done checking for max_fail_percentage 7491 1727203963.62946: checking to see if all hosts have failed and the running result is not ok 7491 1727203963.62947: done checking to see if all hosts have failed 7491 1727203963.62948: getting the remaining hosts for this loop 7491 1727203963.62949: done getting the remaining hosts for this loop 7491 1727203963.62952: getting the next task for host managed-node3 7491 1727203963.62959: done getting next task for host managed-node3 7491 
1727203963.62962: ^ task is: TASK: Show current_interfaces 7491 1727203963.62967: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203963.62970: getting variables 7491 1727203963.62971: in VariableManager get_vars() 7491 1727203963.63014: Calling all_inventory to load vars for managed-node3 7491 1727203963.63018: Calling groups_inventory to load vars for managed-node3 7491 1727203963.63020: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203963.63027: Calling all_plugins_play to load vars for managed-node3 7491 1727203963.63029: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203963.63031: Calling groups_plugins_play to load vars for managed-node3 7491 1727203963.63171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203963.63289: done with get_vars() 7491 1727203963.63297: done getting variables 7491 1727203963.63373: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:52:43 -0400 (0:00:00.024) 0:00:05.557 ***** 7491 1727203963.63402: entering _queue_task() for managed-node3/debug 7491 1727203963.63666: worker is 1 (out of 1 available) 7491 1727203963.63680: exiting _queue_task() for managed-node3/debug 7491 1727203963.63693: done queuing things up, now waiting for results queue to drain 7491 1727203963.63695: waiting for pending results... 7491 1727203963.63959: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 7491 1727203963.64071: in run() - task 0affcd87-79f5-0a4a-ad01-00000000057f 7491 1727203963.64090: variable 'ansible_search_path' from source: unknown 7491 1727203963.64097: variable 'ansible_search_path' from source: unknown 7491 1727203963.64141: calling self._execute() 7491 1727203963.64263: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.64277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.64305: variable 'omit' from source: magic vars 7491 1727203963.64865: variable 'ansible_distribution_major_version' from source: facts 7491 1727203963.64891: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203963.64903: variable 'omit' from source: magic vars 7491 1727203963.64966: variable 'omit' from source: magic vars 7491 1727203963.65089: variable 'current_interfaces' from source: set_fact 7491 1727203963.65138: variable 'omit' from source: magic vars 7491 1727203963.65186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203963.65226: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203963.65253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203963.65278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203963.65293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203963.65330: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203963.65339: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.65346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.65457: Set connection var ansible_timeout to 10 7491 1727203963.65472: Set connection var ansible_pipelining to False 7491 1727203963.65489: Set connection var ansible_shell_type to sh 7491 1727203963.65501: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203963.65512: Set connection var ansible_shell_executable to /bin/sh 7491 1727203963.65525: Set connection var ansible_connection to ssh 7491 1727203963.65551: variable 'ansible_shell_executable' from source: unknown 7491 1727203963.65558: variable 'ansible_connection' from source: unknown 7491 1727203963.65566: variable 'ansible_module_compression' from source: unknown 7491 1727203963.65573: variable 'ansible_shell_type' from source: unknown 7491 1727203963.65578: variable 'ansible_shell_executable' from source: unknown 7491 1727203963.65703: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.65712: variable 'ansible_pipelining' from source: unknown 7491 1727203963.65722: variable 'ansible_timeout' from source: unknown 7491 1727203963.65728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 
1727203963.65878: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203963.65894: variable 'omit' from source: magic vars 7491 1727203963.65904: starting attempt loop 7491 1727203963.65910: running the handler 7491 1727203963.65963: handler run complete 7491 1727203963.65984: attempt loop complete, returning result 7491 1727203963.65992: _execute() done 7491 1727203963.66000: dumping result to json 7491 1727203963.66008: done dumping result, returning 7491 1727203963.66024: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [0affcd87-79f5-0a4a-ad01-00000000057f] 7491 1727203963.66037: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000057f ok: [managed-node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7491 1727203963.66186: no more pending results, returning what we have 7491 1727203963.66190: results queue empty 7491 1727203963.66191: checking for any_errors_fatal 7491 1727203963.66196: done checking for any_errors_fatal 7491 1727203963.66197: checking for max_fail_percentage 7491 1727203963.66199: done checking for max_fail_percentage 7491 1727203963.66200: checking to see if all hosts have failed and the running result is not ok 7491 1727203963.66201: done checking to see if all hosts have failed 7491 1727203963.66202: getting the remaining hosts for this loop 7491 1727203963.66204: done getting the remaining hosts for this loop 7491 1727203963.66208: getting the next task for host managed-node3 7491 1727203963.66221: done getting next task for host managed-node3 7491 1727203963.66224: ^ task is: TASK: Install iproute 7491 1727203963.66228: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203963.66233: getting variables 7491 1727203963.66235: in VariableManager get_vars() 7491 1727203963.66287: Calling all_inventory to load vars for managed-node3 7491 1727203963.66290: Calling groups_inventory to load vars for managed-node3 7491 1727203963.66293: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203963.66305: Calling all_plugins_play to load vars for managed-node3 7491 1727203963.66308: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203963.66312: Calling groups_plugins_play to load vars for managed-node3 7491 1727203963.66547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203963.66772: done with get_vars() 7491 1727203963.66786: done getting variables 7491 1727203963.66848: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203963.67135: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000057f 7491 1727203963.67139: WORKER PROCESS EXITING TASK [Install iproute] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:52:43 -0400 (0:00:00.037) 
0:00:05.595 ***** 7491 1727203963.67153: entering _queue_task() for managed-node3/package 7491 1727203963.67393: worker is 1 (out of 1 available) 7491 1727203963.67407: exiting _queue_task() for managed-node3/package 7491 1727203963.67423: done queuing things up, now waiting for results queue to drain 7491 1727203963.67424: waiting for pending results... 7491 1727203963.67665: running TaskExecutor() for managed-node3/TASK: Install iproute 7491 1727203963.67771: in run() - task 0affcd87-79f5-0a4a-ad01-0000000003a8 7491 1727203963.67791: variable 'ansible_search_path' from source: unknown 7491 1727203963.67799: variable 'ansible_search_path' from source: unknown 7491 1727203963.67846: calling self._execute() 7491 1727203963.67946: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.67956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.67979: variable 'omit' from source: magic vars 7491 1727203963.68663: variable 'ansible_distribution_major_version' from source: facts 7491 1727203963.68684: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203963.69026: variable 'omit' from source: magic vars 7491 1727203963.69102: variable 'omit' from source: magic vars 7491 1727203963.69321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203963.71809: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203963.71902: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203963.71951: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203963.72001: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203963.72035: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203963.72136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203963.72179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203963.72211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203963.72261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203963.72286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203963.72410: variable '__network_is_ostree' from source: set_fact 7491 1727203963.72426: variable 'omit' from source: magic vars 7491 1727203963.72461: variable 'omit' from source: magic vars 7491 1727203963.72504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203963.72539: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203963.72562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203963.72585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203963.72604: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203963.72640: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203963.72648: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.72655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.72761: Set connection var ansible_timeout to 10 7491 1727203963.72777: Set connection var ansible_pipelining to False 7491 1727203963.72787: Set connection var ansible_shell_type to sh 7491 1727203963.72799: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203963.72813: Set connection var ansible_shell_executable to /bin/sh 7491 1727203963.72828: Set connection var ansible_connection to ssh 7491 1727203963.72858: variable 'ansible_shell_executable' from source: unknown 7491 1727203963.72869: variable 'ansible_connection' from source: unknown 7491 1727203963.72877: variable 'ansible_module_compression' from source: unknown 7491 1727203963.72884: variable 'ansible_shell_type' from source: unknown 7491 1727203963.72890: variable 'ansible_shell_executable' from source: unknown 7491 1727203963.72897: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203963.72915: variable 'ansible_pipelining' from source: unknown 7491 1727203963.72929: variable 'ansible_timeout' from source: unknown 7491 1727203963.72941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203963.73054: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203963.73072: variable 'omit' from source: magic vars 7491 1727203963.73081: starting attempt loop 7491 
1727203963.73087: running the handler 7491 1727203963.73097: variable 'ansible_facts' from source: unknown 7491 1727203963.73103: variable 'ansible_facts' from source: unknown 7491 1727203963.73147: _low_level_execute_command(): starting 7491 1727203963.73159: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203963.73888: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203963.73907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203963.73925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.73971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.73976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203963.73987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.74038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203963.74045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203963.74091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203963.75710: stdout chunk (state=3): >>>/root <<< 7491 1727203963.75877: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203963.75912: stderr chunk (state=3): >>><<< 7491 1727203963.75919: stdout chunk (state=3): >>><<< 7491 1727203963.76046: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203963.76049: _low_level_execute_command(): starting 7491 1727203963.76052: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203963.7594752-7962-19000945376439 `" && echo ansible-tmp-1727203963.7594752-7962-19000945376439="` echo /root/.ansible/tmp/ansible-tmp-1727203963.7594752-7962-19000945376439 `" ) && sleep 0' 7491 1727203963.76676: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203963.76699: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.76704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.76736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.76740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.76742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.76819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203963.76823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203963.76858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203963.78652: stdout chunk (state=3): >>>ansible-tmp-1727203963.7594752-7962-19000945376439=/root/.ansible/tmp/ansible-tmp-1727203963.7594752-7962-19000945376439 <<< 7491 1727203963.78800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203963.78845: stderr chunk (state=3): >>><<< 7491 1727203963.78848: stdout chunk (state=3): >>><<< 7491 1727203963.78863: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203963.7594752-7962-19000945376439=/root/.ansible/tmp/ansible-tmp-1727203963.7594752-7962-19000945376439 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203963.78898: variable 'ansible_module_compression' from source: unknown 7491 1727203963.78946: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 7491 1727203963.78950: ANSIBALLZ: Acquiring lock 7491 1727203963.78952: ANSIBALLZ: Lock acquired: 139674606106048 7491 1727203963.78954: ANSIBALLZ: Creating module 7491 1727203963.95428: ANSIBALLZ: Writing module into payload 7491 1727203963.95625: ANSIBALLZ: Writing module 7491 1727203963.95647: ANSIBALLZ: Renaming module 7491 1727203963.95656: ANSIBALLZ: Done creating module 7491 1727203963.95673: variable 'ansible_facts' from source: unknown 7491 1727203963.95733: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203963.7594752-7962-19000945376439/AnsiballZ_dnf.py 7491 1727203963.95843: Sending initial data 7491 1727203963.95853: Sent initial data (149 bytes) 7491 
1727203963.96773: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203963.96788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203963.96803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.96823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.96869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203963.96881: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203963.96895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.96911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203963.96925: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203963.96937: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203963.96949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203963.96962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203963.96981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203963.96993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203963.97007: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203963.97025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203963.97262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203963.97286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
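The `_low_level_execute_command()` / sftp sequence in this stretch of the log is Ansible's AnsiballZ transport: make a unique remote temp directory, `put` the self-contained module payload (`AnsiballZ_dnf.py`), `chmod u+x` it, then execute it. A rough sketch of the temp-dir naming, assuming the epoch-pid-random scheme suggested by the name `ansible-tmp-1727203963.7594752-7962-19000945376439` visible above (the exact format and random width are assumptions, not read from Ansible's source here):

```python
import os
import random
import time

def ansible_tmp_name() -> str:
    """Build a name shaped like ansible-tmp-<epoch>-<pid>-<random>.

    Mirrors the directory name seen in the log; the random-value width is an
    assumption for illustration.
    """
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))

name = ansible_tmp_name()
```

Making the name unique per invocation means concurrent plays against the same host never collide in `/root/.ansible/tmp`.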
7491 1727203963.97387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203963.99177: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203963.99238: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203963.99262: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmps0_s2gl2 /root/.ansible/tmp/ansible-tmp-1727203963.7594752-7962-19000945376439/AnsiballZ_dnf.py <<< 7491 1727203963.99276: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203964.01502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203964.01641: stderr chunk (state=3): >>><<< 7491 1727203964.01645: stdout chunk (state=3): >>><<< 7491 1727203964.01647: done transferring module to remote 7491 1727203964.01649: _low_level_execute_command(): starting 7491 1727203964.01652: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203963.7594752-7962-19000945376439/ /root/.ansible/tmp/ansible-tmp-1727203963.7594752-7962-19000945376439/AnsiballZ_dnf.py && sleep 0' 7491 1727203964.03466: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203964.03485: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 7491 1727203964.03500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203964.03523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203964.03572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203964.03588: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203964.03605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203964.03628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203964.03642: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203964.03654: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203964.03668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203964.03684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203964.03700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203964.03713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203964.03728: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203964.03742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203964.03822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203964.03893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203964.03911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203964.04039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 
1727203964.05784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203964.05868: stderr chunk (state=3): >>><<< 7491 1727203964.05872: stdout chunk (state=3): >>><<< 7491 1727203964.05970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203964.05973: _low_level_execute_command(): starting 7491 1727203964.05976: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203963.7594752-7962-19000945376439/AnsiballZ_dnf.py && sleep 0' 7491 1727203964.07386: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203964.07401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203964.07883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 
1727203964.08048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203964.08095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203964.08109: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203964.08128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203964.08147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203964.08160: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203964.08175: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203964.08189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203964.08206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203964.08225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203964.08237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203964.08249: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203964.08263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203964.08342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203964.08380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203964.08400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203964.08485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203967.48127: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": 
{"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7491 1727203967.52337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203967.52395: stderr chunk (state=3): >>><<< 7491 1727203967.52399: stdout chunk (state=3): >>><<< 7491 1727203967.52415: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203967.52453: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203963.7594752-7962-19000945376439/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203967.52468: _low_level_execute_command(): starting 7491 1727203967.52471: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203963.7594752-7962-19000945376439/ 
> /dev/null 2>&1 && sleep 0' 7491 1727203967.52946: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203967.52950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203967.52975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203967.52989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203967.52992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.53047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203967.53050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203967.53053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203967.53099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203967.54868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203967.54923: stderr chunk (state=3): >>><<< 7491 1727203967.54926: stdout chunk (state=3): >>><<< 7491 1727203967.54939: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203967.54946: handler run complete 7491 1727203967.55065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203967.55189: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203967.55222: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203967.55243: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203967.55268: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203967.55324: variable '__install_status' from source: unknown 7491 1727203967.55336: Evaluated conditional (__install_status is success): True 7491 1727203967.55348: attempt loop complete, returning result 7491 1727203967.55351: _execute() done 7491 1727203967.55353: dumping result to json 7491 
1727203967.55359: done dumping result, returning 7491 1727203967.55367: done running TaskExecutor() for managed-node3/TASK: Install iproute [0affcd87-79f5-0a4a-ad01-0000000003a8] 7491 1727203967.55373: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003a8 7491 1727203967.55475: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003a8 7491 1727203967.55478: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7491 1727203967.55637: no more pending results, returning what we have 7491 1727203967.55640: results queue empty 7491 1727203967.55641: checking for any_errors_fatal 7491 1727203967.55645: done checking for any_errors_fatal 7491 1727203967.55646: checking for max_fail_percentage 7491 1727203967.55648: done checking for max_fail_percentage 7491 1727203967.55648: checking to see if all hosts have failed and the running result is not ok 7491 1727203967.55654: done checking to see if all hosts have failed 7491 1727203967.55654: getting the remaining hosts for this loop 7491 1727203967.55656: done getting the remaining hosts for this loop 7491 1727203967.55659: getting the next task for host managed-node3 7491 1727203967.55665: done getting next task for host managed-node3 7491 1727203967.55667: ^ task is: TASK: Create veth interface {{ interface }} 7491 1727203967.55670: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203967.55673: getting variables 7491 1727203967.55674: in VariableManager get_vars() 7491 1727203967.55715: Calling all_inventory to load vars for managed-node3 7491 1727203967.55720: Calling groups_inventory to load vars for managed-node3 7491 1727203967.55722: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203967.55731: Calling all_plugins_play to load vars for managed-node3 7491 1727203967.55734: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203967.55736: Calling groups_plugins_play to load vars for managed-node3 7491 1727203967.55856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203967.55977: done with get_vars() 7491 1727203967.55985: done getting variables 7491 1727203967.56031: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203967.56123: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:52:47 -0400 (0:00:03.890) 0:00:09.485 ***** 7491 1727203967.56155: entering _queue_task() for managed-node3/command 7491 1727203967.56342: worker is 1 (out of 1 available) 7491 1727203967.56355: exiting _queue_task() for managed-node3/command 7491 1727203967.56370: done queuing things up, now waiting for results queue to drain 7491 1727203967.56371: waiting for pending results... 
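The dnf module result earlier in the log (`{"msg": "Nothing to do", "changed": false, …, "rc": 0}`) is a JSON document the module prints on stdout; the controller derives the task status shown in the play output (`ok: [managed-node3]`) from its keys. A minimal sketch of that classification (a hypothetical helper, not Ansible's actual implementation):

```python
import json

def classify(stdout: str) -> str:
    """Map a module's JSON result to a task status.

    rc=0 with changed=false yields 'ok', matching the
    "Nothing to do" dnf result captured in this log.
    """
    result = json.loads(stdout)
    if result.get("failed") or result.get("rc", 0) != 0:
        return "failed"
    return "changed" if result.get("changed") else "ok"

# The shape of the dnf result seen above (module_args trimmed):
dnf_result = '{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0}'
print(classify(dnf_result))  # -> ok
```

The same logic explains why the task is reported `ok` rather than `changed`: iproute was already installed, so the module exited 0 without modifying the host.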
7491 1727203967.56526: running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 7491 1727203967.56597: in run() - task 0affcd87-79f5-0a4a-ad01-0000000003a9 7491 1727203967.56611: variable 'ansible_search_path' from source: unknown 7491 1727203967.56615: variable 'ansible_search_path' from source: unknown 7491 1727203967.56825: variable 'interface' from source: play vars 7491 1727203967.56890: variable 'interface' from source: play vars 7491 1727203967.56943: variable 'interface' from source: play vars 7491 1727203967.57056: Loaded config def from plugin (lookup/items) 7491 1727203967.57061: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7491 1727203967.57081: variable 'omit' from source: magic vars 7491 1727203967.57174: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203967.57181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203967.57190: variable 'omit' from source: magic vars 7491 1727203967.57351: variable 'ansible_distribution_major_version' from source: facts 7491 1727203967.57358: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203967.57490: variable 'type' from source: play vars 7491 1727203967.57494: variable 'state' from source: include params 7491 1727203967.57497: variable 'interface' from source: play vars 7491 1727203967.57502: variable 'current_interfaces' from source: set_fact 7491 1727203967.57508: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7491 1727203967.57516: variable 'omit' from source: magic vars 7491 1727203967.57543: variable 'omit' from source: magic vars 7491 1727203967.57575: variable 'item' from source: unknown 7491 1727203967.57625: variable 'item' from source: unknown 7491 1727203967.57637: variable 'omit' from source: magic vars 7491 1727203967.57664: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203967.57687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203967.57703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203967.57744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203967.57754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203967.57779: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203967.57782: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203967.57785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203967.57852: Set connection var ansible_timeout to 10 7491 1727203967.57856: Set connection var ansible_pipelining to False 7491 1727203967.57861: Set connection var ansible_shell_type to sh 7491 1727203967.57868: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203967.57878: Set connection var ansible_shell_executable to /bin/sh 7491 1727203967.57886: Set connection var ansible_connection to ssh 7491 1727203967.57902: variable 'ansible_shell_executable' from source: unknown 7491 1727203967.57905: variable 'ansible_connection' from source: unknown 7491 1727203967.57907: variable 'ansible_module_compression' from source: unknown 7491 1727203967.57909: variable 'ansible_shell_type' from source: unknown 7491 1727203967.57911: variable 'ansible_shell_executable' from source: unknown 7491 1727203967.57913: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203967.57919: variable 'ansible_pipelining' from source: unknown 7491 1727203967.57922: variable 'ansible_timeout' from source: unknown 7491 
1727203967.57927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203967.58029: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203967.58036: variable 'omit' from source: magic vars 7491 1727203967.58145: starting attempt loop 7491 1727203967.58148: running the handler 7491 1727203967.58150: _low_level_execute_command(): starting 7491 1727203967.58153: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203967.58787: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203967.58801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203967.58811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203967.58831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203967.58872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203967.58880: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203967.58891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.58906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203967.58915: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203967.58924: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203967.58932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203967.58943: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 7491 1727203967.58955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203967.58968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203967.58975: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203967.58986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.59060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203967.59078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203967.59087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203967.59176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203967.60727: stdout chunk (state=3): >>>/root <<< 7491 1727203967.60828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203967.60883: stderr chunk (state=3): >>><<< 7491 1727203967.60887: stdout chunk (state=3): >>><<< 7491 1727203967.60907: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203967.60923: _low_level_execute_command(): starting 7491 1727203967.60927: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203967.6090763-8229-29019150899027 `" && echo ansible-tmp-1727203967.6090763-8229-29019150899027="` echo /root/.ansible/tmp/ansible-tmp-1727203967.6090763-8229-29019150899027 `" ) && sleep 0' 7491 1727203967.61378: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203967.61384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203967.61438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203967.61441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.61444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203967.61447: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.61504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203967.61512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203967.61515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203967.61556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203967.63389: stdout chunk (state=3): >>>ansible-tmp-1727203967.6090763-8229-29019150899027=/root/.ansible/tmp/ansible-tmp-1727203967.6090763-8229-29019150899027 <<< 7491 1727203967.63494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203967.63563: stderr chunk (state=3): >>><<< 7491 1727203967.63568: stdout chunk (state=3): >>><<< 7491 1727203967.63586: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203967.6090763-8229-29019150899027=/root/.ansible/tmp/ansible-tmp-1727203967.6090763-8229-29019150899027 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203967.63614: variable 'ansible_module_compression' from source: unknown 7491 1727203967.63658: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203967.63689: variable 'ansible_facts' from source: unknown 7491 1727203967.63748: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203967.6090763-8229-29019150899027/AnsiballZ_command.py 7491 1727203967.63862: Sending initial data 7491 1727203967.63872: Sent initial data (153 bytes) 7491 1727203967.64577: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203967.64581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203967.64618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.64623: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203967.64625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.64677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
<<< 7491 1727203967.64686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203967.64740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203967.66418: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203967.66453: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203967.66491: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp647aydbk /root/.ansible/tmp/ansible-tmp-1727203967.6090763-8229-29019150899027/AnsiballZ_command.py <<< 7491 1727203967.66528: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203967.67329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203967.67446: stderr chunk (state=3): >>><<< 7491 1727203967.67450: stdout chunk (state=3): >>><<< 7491 1727203967.67466: done transferring module to remote 7491 1727203967.67475: _low_level_execute_command(): starting 7491 1727203967.67480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203967.6090763-8229-29019150899027/ /root/.ansible/tmp/ansible-tmp-1727203967.6090763-8229-29019150899027/AnsiballZ_command.py && sleep 0' 7491 1727203967.67942: stderr 
chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203967.67955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203967.67975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203967.68003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.68042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203967.68054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203967.68102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203967.69744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203967.69794: stderr chunk (state=3): >>><<< 7491 1727203967.69798: stdout chunk (state=3): >>><<< 7491 1727203967.69814: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203967.69817: _low_level_execute_command(): starting 7491 1727203967.69828: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203967.6090763-8229-29019150899027/AnsiballZ_command.py && sleep 0' 7491 1727203967.70287: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203967.70290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203967.70326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.70329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 
1727203967.70332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.70380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203967.70396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203967.70439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203967.85105: stdout chunk (state=3): >>> <<< 7491 1727203967.85120: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-24 14:52:47.832907", "end": "2024-09-24 14:52:47.850020", "delta": "0:00:00.017113", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203967.86948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203967.87010: stderr chunk (state=3): >>><<< 7491 1727203967.87013: stdout chunk (state=3): >>><<< 7491 1727203967.87033: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-24 14:52:47.832907", "end": "2024-09-24 14:52:47.850020", "delta": "0:00:00.017113", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
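The round trip above shows Ansible's `command` module returning a single JSON object on the remote host's stdout, which the controller then parses into the task result. A minimal sketch of that parsing step, using the exact result payload captured in the log (the field names and values are copied from the log; the variable names here are illustrative, not Ansible internals):

```python
import json

# Sample payload mirroring the module output captured in the log above for
# the "ip link add veth0 type veth peer name peerveth0" task. AnsiballZ
# wrapper scripts emit exactly one JSON object like this on stdout.
raw_result = (
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
    '"cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], '
    '"start": "2024-09-24 14:52:47.832907", "end": "2024-09-24 14:52:47.850020", '
    '"delta": "0:00:00.017113", "msg": ""}'
)

result = json.loads(raw_result)

# The controller gates task status on fields like rc and changed;
# a consumer of this payload can do the same.
assert result["rc"] == 0
assert result["changed"] is True
print(result["cmd"])  # the argv actually executed on the remote host
```

This is only a sketch of the contract visible in the log, not Ansible's actual result-handling code; the real controller additionally merges `invocation` metadata and applies `changed_when`/`failed_when` logic before printing the `ok:` line seen later in this log.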
7491 1727203967.87065: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203967.6090763-8229-29019150899027/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203967.87073: _low_level_execute_command(): starting 7491 1727203967.87078: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203967.6090763-8229-29019150899027/ > /dev/null 2>&1 && sleep 0' 7491 1727203967.87551: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203967.87579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203967.87592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727203967.87606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.87654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203967.87668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203967.87924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203967.91327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203967.91383: stderr chunk (state=3): >>><<< 7491 1727203967.91387: stdout chunk (state=3): >>><<< 7491 1727203967.91401: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 
1727203967.91407: handler run complete 7491 1727203967.91431: Evaluated conditional (False): False 7491 1727203967.91439: attempt loop complete, returning result 7491 1727203967.91454: variable 'item' from source: unknown 7491 1727203967.91520: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.017113", "end": "2024-09-24 14:52:47.850020", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-24 14:52:47.832907" } 7491 1727203967.91692: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203967.91695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203967.91698: variable 'omit' from source: magic vars 7491 1727203967.91776: variable 'ansible_distribution_major_version' from source: facts 7491 1727203967.91779: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203967.91902: variable 'type' from source: play vars 7491 1727203967.91906: variable 'state' from source: include params 7491 1727203967.91908: variable 'interface' from source: play vars 7491 1727203967.91913: variable 'current_interfaces' from source: set_fact 7491 1727203967.91925: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7491 1727203967.91928: variable 'omit' from source: magic vars 7491 1727203967.91940: variable 'omit' from source: magic vars 7491 1727203967.91968: variable 'item' from source: unknown 7491 1727203967.92013: variable 'item' from source: unknown 7491 1727203967.92033: variable 'omit' from source: magic vars 7491 1727203967.92046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 
1727203967.92055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203967.92061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203967.92074: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203967.92077: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203967.92079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203967.92139: Set connection var ansible_timeout to 10 7491 1727203967.92145: Set connection var ansible_pipelining to False 7491 1727203967.92147: Set connection var ansible_shell_type to sh 7491 1727203967.92150: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203967.92152: Set connection var ansible_shell_executable to /bin/sh 7491 1727203967.92157: Set connection var ansible_connection to ssh 7491 1727203967.92173: variable 'ansible_shell_executable' from source: unknown 7491 1727203967.92175: variable 'ansible_connection' from source: unknown 7491 1727203967.92178: variable 'ansible_module_compression' from source: unknown 7491 1727203967.92180: variable 'ansible_shell_type' from source: unknown 7491 1727203967.92182: variable 'ansible_shell_executable' from source: unknown 7491 1727203967.92184: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203967.92189: variable 'ansible_pipelining' from source: unknown 7491 1727203967.92191: variable 'ansible_timeout' from source: unknown 7491 1727203967.92195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203967.92267: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203967.92275: variable 'omit' from source: magic vars 7491 1727203967.92279: starting attempt loop 7491 1727203967.92282: running the handler 7491 1727203967.92289: _low_level_execute_command(): starting 7491 1727203967.92293: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203967.92759: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203967.92774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203967.92804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.92819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.92860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203967.92876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203967.92939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203967.94476: stdout chunk 
(state=3): >>>/root <<< 7491 1727203967.94583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203967.94645: stderr chunk (state=3): >>><<< 7491 1727203967.94649: stdout chunk (state=3): >>><<< 7491 1727203967.94663: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203967.94674: _low_level_execute_command(): starting 7491 1727203967.94679: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203967.9466584-8229-158555831893048 `" && echo ansible-tmp-1727203967.9466584-8229-158555831893048="` echo /root/.ansible/tmp/ansible-tmp-1727203967.9466584-8229-158555831893048 `" ) && sleep 0' 7491 1727203967.95156: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 7491 1727203967.95178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203967.95200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203967.95211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203967.95224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.95265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203967.95282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203967.95334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203967.97144: stdout chunk (state=3): >>>ansible-tmp-1727203967.9466584-8229-158555831893048=/root/.ansible/tmp/ansible-tmp-1727203967.9466584-8229-158555831893048 <<< 7491 1727203967.97254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203967.97312: stderr chunk (state=3): >>><<< 7491 1727203967.97315: stdout chunk (state=3): >>><<< 7491 1727203967.97333: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203967.9466584-8229-158555831893048=/root/.ansible/tmp/ansible-tmp-1727203967.9466584-8229-158555831893048 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203967.97354: variable 'ansible_module_compression' from source: unknown 7491 1727203967.97386: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203967.97402: variable 'ansible_facts' from source: unknown 7491 1727203967.97456: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203967.9466584-8229-158555831893048/AnsiballZ_command.py 7491 1727203967.97559: Sending initial data 7491 1727203967.97571: Sent initial data (154 bytes) 7491 1727203967.98253: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203967.98256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203967.98296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203967.98299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203967.98301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203967.98351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203967.98354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203967.98405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.00079: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203968.00122: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: 
Server handle limit 1019; using 64 <<< 7491 1727203968.00155: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmplejyzqj1 /root/.ansible/tmp/ansible-tmp-1727203967.9466584-8229-158555831893048/AnsiballZ_command.py <<< 7491 1727203968.00190: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203968.00977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.01088: stderr chunk (state=3): >>><<< 7491 1727203968.01091: stdout chunk (state=3): >>><<< 7491 1727203968.01108: done transferring module to remote 7491 1727203968.01116: _low_level_execute_command(): starting 7491 1727203968.01123: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203967.9466584-8229-158555831893048/ /root/.ansible/tmp/ansible-tmp-1727203967.9466584-8229-158555831893048/AnsiballZ_command.py && sleep 0' 7491 1727203968.01587: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203968.01601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.01626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.01643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.01685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.01697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.01748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.03422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.03480: stderr chunk (state=3): >>><<< 7491 1727203968.03485: stdout chunk (state=3): >>><<< 7491 1727203968.03500: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203968.03503: _low_level_execute_command(): starting 7491 1727203968.03508: _low_level_execute_command(): executing: /bin/sh -c 
'/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203967.9466584-8229-158555831893048/AnsiballZ_command.py && sleep 0' 7491 1727203968.03967: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203968.03986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.04001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.04019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.04062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.04077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.04133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.17561: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-24 14:52:48.171427", "end": "2024-09-24 14:52:48.174833", "delta": "0:00:00.003406", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203968.18704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203968.18773: stderr chunk (state=3): >>><<< 7491 1727203968.18779: stdout chunk (state=3): >>><<< 7491 1727203968.18794: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-24 14:52:48.171427", "end": "2024-09-24 14:52:48.174833", "delta": "0:00:00.003406", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203968.18822: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203967.9466584-8229-158555831893048/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203968.18832: _low_level_execute_command(): starting 7491 1727203968.18836: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203967.9466584-8229-158555831893048/ > /dev/null 2>&1 && sleep 0' 7491 1727203968.19321: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203968.19326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.19358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.19363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203968.19365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.19422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.19426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203968.19428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.19478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.21215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.21272: stderr chunk (state=3): >>><<< 7491 1727203968.21275: stdout chunk (state=3): >>><<< 7491 1727203968.21292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203968.21296: handler run complete 7491 1727203968.21318: Evaluated conditional (False): False 7491 1727203968.21324: attempt loop complete, returning result 7491 1727203968.21339: variable 'item' from source: unknown 7491 1727203968.21404: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003406", "end": "2024-09-24 14:52:48.174833", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-24 14:52:48.171427" } 7491 1727203968.21534: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.21538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.21540: variable 'omit' from source: magic vars 7491 1727203968.21635: variable 'ansible_distribution_major_version' from source: facts 7491 1727203968.21638: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203968.21759: variable 'type' from source: play vars 7491 1727203968.21763: variable 'state' from source: include params 7491 1727203968.21768: variable 'interface' from source: play vars 7491 1727203968.21772: variable 'current_interfaces' from source: set_fact 7491 1727203968.21777: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7491 1727203968.21782: variable 'omit' from source: magic vars 7491 1727203968.21794: variable 'omit' from source: magic vars 7491 1727203968.21821: variable 'item' from source: unknown 7491 1727203968.21868: variable 'item' from source: unknown 7491 1727203968.21881: variable 'omit' from source: magic vars 7491 1727203968.21897: Loading Connection 'ssh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203968.21904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203968.21910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203968.21921: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203968.21924: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.21926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.21980: Set connection var ansible_timeout to 10 7491 1727203968.21983: Set connection var ansible_pipelining to False 7491 1727203968.21990: Set connection var ansible_shell_type to sh 7491 1727203968.21996: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203968.22001: Set connection var ansible_shell_executable to /bin/sh 7491 1727203968.22006: Set connection var ansible_connection to ssh 7491 1727203968.22020: variable 'ansible_shell_executable' from source: unknown 7491 1727203968.22027: variable 'ansible_connection' from source: unknown 7491 1727203968.22029: variable 'ansible_module_compression' from source: unknown 7491 1727203968.22032: variable 'ansible_shell_type' from source: unknown 7491 1727203968.22034: variable 'ansible_shell_executable' from source: unknown 7491 1727203968.22036: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.22038: variable 'ansible_pipelining' from source: unknown 7491 1727203968.22042: variable 'ansible_timeout' from source: unknown 7491 1727203968.22046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.22114: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203968.22125: variable 'omit' from source: magic vars 7491 1727203968.22128: starting attempt loop 7491 1727203968.22130: running the handler 7491 1727203968.22133: _low_level_execute_command(): starting 7491 1727203968.22137: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203968.22620: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203968.22633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.22652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203968.22665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.22713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.22732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.22783: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7491 1727203968.24281: stdout chunk (state=3): >>>/root <<< 7491 1727203968.24377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.24436: stderr chunk (state=3): >>><<< 7491 1727203968.24441: stdout chunk (state=3): >>><<< 7491 1727203968.24460: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203968.24470: _low_level_execute_command(): starting 7491 1727203968.24477: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203968.244598-8229-94910302779318 `" && echo ansible-tmp-1727203968.244598-8229-94910302779318="` echo /root/.ansible/tmp/ansible-tmp-1727203968.244598-8229-94910302779318 `" ) && sleep 0' 7491 1727203968.24938: stderr 
chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203968.24951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.24973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203968.24993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.25038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.25049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.25103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.26898: stdout chunk (state=3): >>>ansible-tmp-1727203968.244598-8229-94910302779318=/root/.ansible/tmp/ansible-tmp-1727203968.244598-8229-94910302779318 <<< 7491 1727203968.27003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.27071: stderr chunk (state=3): >>><<< 7491 1727203968.27075: stdout chunk (state=3): >>><<< 7491 1727203968.27090: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727203968.244598-8229-94910302779318=/root/.ansible/tmp/ansible-tmp-1727203968.244598-8229-94910302779318 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203968.27107: variable 'ansible_module_compression' from source: unknown 7491 1727203968.27146: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203968.27165: variable 'ansible_facts' from source: unknown 7491 1727203968.27212: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203968.244598-8229-94910302779318/AnsiballZ_command.py 7491 1727203968.27315: Sending initial data 7491 1727203968.27325: Sent initial data (152 bytes) 7491 1727203968.28042: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7491 1727203968.28046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.28078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203968.28081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203968.28087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.28143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.28147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203968.28149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.28194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.29843: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 7491 1727203968.29846: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203968.29881: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203968.29922: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpg8s1bm5_ /root/.ansible/tmp/ansible-tmp-1727203968.244598-8229-94910302779318/AnsiballZ_command.py <<< 7491 1727203968.29956: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203968.30759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.30875: stderr chunk (state=3): >>><<< 7491 1727203968.30878: stdout chunk (state=3): >>><<< 7491 1727203968.30894: done transferring module to remote 7491 1727203968.30905: _low_level_execute_command(): starting 7491 1727203968.30908: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203968.244598-8229-94910302779318/ /root/.ansible/tmp/ansible-tmp-1727203968.244598-8229-94910302779318/AnsiballZ_command.py && sleep 0' 7491 1727203968.31373: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203968.31391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.31406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203968.31418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.31428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.31476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.31488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.31538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.33217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.33274: stderr chunk (state=3): >>><<< 7491 1727203968.33280: stdout chunk (state=3): >>><<< 7491 1727203968.33297: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203968.33300: _low_level_execute_command(): starting 7491 1727203968.33305: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203968.244598-8229-94910302779318/AnsiballZ_command.py && sleep 0' 7491 1727203968.33782: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203968.33795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.33818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203968.33835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.33880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.33893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.33950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.47478: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": 
"2024-09-24 14:52:48.468085", "end": "2024-09-24 14:52:48.473990", "delta": "0:00:00.005905", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203968.48598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203968.48659: stderr chunk (state=3): >>><<< 7491 1727203968.48663: stdout chunk (state=3): >>><<< 7491 1727203968.48680: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-24 14:52:48.468085", "end": "2024-09-24 14:52:48.473990", "delta": "0:00:00.005905", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 
10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203968.48705: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203968.244598-8229-94910302779318/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203968.48709: _low_level_execute_command(): starting 7491 1727203968.48715: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203968.244598-8229-94910302779318/ > /dev/null 2>&1 && sleep 0' 7491 1727203968.49193: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203968.49209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.49229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass <<< 7491 1727203968.49242: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.49290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203968.49304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.49359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.51088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.51142: stderr chunk (state=3): >>><<< 7491 1727203968.51145: stdout chunk (state=3): >>><<< 7491 1727203968.51160: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203968.51167: handler run complete 7491 1727203968.51183: Evaluated conditional (False): False 7491 1727203968.51190: attempt loop complete, returning result 7491 1727203968.51206: variable 'item' from source: unknown 7491 1727203968.51272: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.005905", "end": "2024-09-24 14:52:48.473990", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-24 14:52:48.468085" } 7491 1727203968.51394: dumping result to json 7491 1727203968.51397: done dumping result, returning 7491 1727203968.51399: done running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 [0affcd87-79f5-0a4a-ad01-0000000003a9] 7491 1727203968.51400: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003a9 7491 1727203968.51445: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003a9 7491 1727203968.51448: WORKER PROCESS EXITING 7491 1727203968.51507: no more pending results, returning what we have 7491 1727203968.51511: results queue empty 7491 1727203968.51512: checking for any_errors_fatal 7491 1727203968.51519: done checking for any_errors_fatal 7491 1727203968.51520: checking for max_fail_percentage 7491 1727203968.51521: done checking for max_fail_percentage 7491 1727203968.51522: checking to see if all hosts have failed and the running result is not ok 7491 1727203968.51523: done checking to see if all hosts have failed 7491 1727203968.51523: getting the remaining hosts for this loop 7491 1727203968.51525: done getting the remaining hosts 
for this loop 7491 1727203968.51528: getting the next task for host managed-node3 7491 1727203968.51533: done getting next task for host managed-node3 7491 1727203968.51535: ^ task is: TASK: Set up veth as managed by NetworkManager 7491 1727203968.51538: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203968.51541: getting variables 7491 1727203968.51542: in VariableManager get_vars() 7491 1727203968.51596: Calling all_inventory to load vars for managed-node3 7491 1727203968.51599: Calling groups_inventory to load vars for managed-node3 7491 1727203968.51601: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203968.51610: Calling all_plugins_play to load vars for managed-node3 7491 1727203968.51612: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203968.51615: Calling groups_plugins_play to load vars for managed-node3 7491 1727203968.51766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203968.51883: done with get_vars() 7491 1727203968.51891: done getting variables 7491 1727203968.51937: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by 
NetworkManager] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:52:48 -0400 (0:00:00.957) 0:00:10.443 ***** 7491 1727203968.51957: entering _queue_task() for managed-node3/command 7491 1727203968.52142: worker is 1 (out of 1 available) 7491 1727203968.52156: exiting _queue_task() for managed-node3/command 7491 1727203968.52170: done queuing things up, now waiting for results queue to drain 7491 1727203968.52172: waiting for pending results... 7491 1727203968.52324: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 7491 1727203968.52387: in run() - task 0affcd87-79f5-0a4a-ad01-0000000003aa 7491 1727203968.52400: variable 'ansible_search_path' from source: unknown 7491 1727203968.52403: variable 'ansible_search_path' from source: unknown 7491 1727203968.52432: calling self._execute() 7491 1727203968.52501: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.52505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.52514: variable 'omit' from source: magic vars 7491 1727203968.52770: variable 'ansible_distribution_major_version' from source: facts 7491 1727203968.52784: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203968.52895: variable 'type' from source: play vars 7491 1727203968.52898: variable 'state' from source: include params 7491 1727203968.52904: Evaluated conditional (type == 'veth' and state == 'present'): True 7491 1727203968.52911: variable 'omit' from source: magic vars 7491 1727203968.52940: variable 'omit' from source: magic vars 7491 1727203968.53012: variable 'interface' from source: play vars 7491 1727203968.53027: variable 'omit' from source: magic vars 7491 1727203968.53058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 
1727203968.53086: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203968.53105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203968.53117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203968.53129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203968.53152: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203968.53155: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.53157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.53232: Set connection var ansible_timeout to 10 7491 1727203968.53236: Set connection var ansible_pipelining to False 7491 1727203968.53241: Set connection var ansible_shell_type to sh 7491 1727203968.53247: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203968.53253: Set connection var ansible_shell_executable to /bin/sh 7491 1727203968.53258: Set connection var ansible_connection to ssh 7491 1727203968.53277: variable 'ansible_shell_executable' from source: unknown 7491 1727203968.53280: variable 'ansible_connection' from source: unknown 7491 1727203968.53282: variable 'ansible_module_compression' from source: unknown 7491 1727203968.53284: variable 'ansible_shell_type' from source: unknown 7491 1727203968.53287: variable 'ansible_shell_executable' from source: unknown 7491 1727203968.53291: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.53293: variable 'ansible_pipelining' from source: unknown 7491 1727203968.53296: variable 'ansible_timeout' from source: unknown 7491 1727203968.53300: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed-node3' 7491 1727203968.53402: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203968.53410: variable 'omit' from source: magic vars 7491 1727203968.53414: starting attempt loop 7491 1727203968.53424: running the handler 7491 1727203968.53434: _low_level_execute_command(): starting 7491 1727203968.53441: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203968.53966: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203968.53989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.54004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7491 1727203968.54019: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.54067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.54092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 7491 1727203968.54136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.55653: stdout chunk (state=3): >>>/root <<< 7491 1727203968.55752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.55814: stderr chunk (state=3): >>><<< 7491 1727203968.55820: stdout chunk (state=3): >>><<< 7491 1727203968.55838: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203968.55850: _low_level_execute_command(): starting 7491 1727203968.55857: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203968.5583978-8258-95468090258402 `" && echo ansible-tmp-1727203968.5583978-8258-95468090258402="` echo 
/root/.ansible/tmp/ansible-tmp-1727203968.5583978-8258-95468090258402 `" ) && sleep 0' 7491 1727203968.56328: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203968.56350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.56376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203968.56387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.56430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.56443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.56489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.58290: stdout chunk (state=3): >>>ansible-tmp-1727203968.5583978-8258-95468090258402=/root/.ansible/tmp/ansible-tmp-1727203968.5583978-8258-95468090258402 <<< 7491 1727203968.58402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.58458: stderr chunk (state=3): >>><<< 7491 1727203968.58461: stdout chunk (state=3): >>><<< 7491 1727203968.58479: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727203968.5583978-8258-95468090258402=/root/.ansible/tmp/ansible-tmp-1727203968.5583978-8258-95468090258402 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203968.58507: variable 'ansible_module_compression' from source: unknown 7491 1727203968.58549: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203968.58579: variable 'ansible_facts' from source: unknown 7491 1727203968.58645: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203968.5583978-8258-95468090258402/AnsiballZ_command.py 7491 1727203968.58758: Sending initial data 7491 1727203968.58763: Sent initial data (153 bytes) 7491 1727203968.59450: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203968.59453: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.59493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203968.59497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.59500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.59556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.59559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203968.59562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.59608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.61283: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 7491 1727203968.61321: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203968.61354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp4y_47704 /root/.ansible/tmp/ansible-tmp-1727203968.5583978-8258-95468090258402/AnsiballZ_command.py <<< 7491 1727203968.61392: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203968.62567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.62780: stderr chunk (state=3): >>><<< 7491 1727203968.62784: stdout chunk (state=3): >>><<< 7491 1727203968.62787: done transferring module to remote 7491 1727203968.62789: _low_level_execute_command(): starting 7491 1727203968.62792: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203968.5583978-8258-95468090258402/ /root/.ansible/tmp/ansible-tmp-1727203968.5583978-8258-95468090258402/AnsiballZ_command.py && sleep 0' 7491 1727203968.63407: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203968.63411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.63467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203968.63471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 7491 1727203968.63474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203968.63476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.63539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.63542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203968.63548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.63581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.65243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.65297: stderr chunk (state=3): >>><<< 7491 1727203968.65301: stdout chunk (state=3): >>><<< 7491 1727203968.65318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203968.65321: _low_level_execute_command(): starting 7491 1727203968.65324: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203968.5583978-8258-95468090258402/AnsiballZ_command.py && sleep 0' 7491 1727203968.65790: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203968.65796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.65816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.65829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203968.65839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.65885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.65897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.65954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.81571: stdout 
chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-24 14:52:48.787532", "end": "2024-09-24 14:52:48.814887", "delta": "0:00:00.027355", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203968.82785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.82804: stderr chunk (state=3): >>>Shared connection to 10.31.15.87 closed. <<< 7491 1727203968.82861: stderr chunk (state=3): >>><<< 7491 1727203968.82866: stdout chunk (state=3): >>><<< 7491 1727203968.82882: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-24 14:52:48.787532", "end": "2024-09-24 14:52:48.814887", "delta": "0:00:00.027355", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203968.82918: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203968.5583978-8258-95468090258402/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203968.82925: _low_level_execute_command(): starting 7491 1727203968.82930: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203968.5583978-8258-95468090258402/ > /dev/null 2>&1 && sleep 0' 7491 1727203968.83403: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203968.83416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 
1727203968.83442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.83460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203968.83503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203968.83518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203968.83574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203968.85310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203968.85365: stderr chunk (state=3): >>><<< 7491 1727203968.85369: stdout chunk (state=3): >>><<< 7491 1727203968.85386: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203968.85394: handler run complete 7491 1727203968.85411: Evaluated conditional (False): False 7491 1727203968.85421: attempt loop complete, returning result 7491 1727203968.85423: _execute() done 7491 1727203968.85426: dumping result to json 7491 1727203968.85428: done dumping result, returning 7491 1727203968.85436: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-0a4a-ad01-0000000003aa] 7491 1727203968.85442: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003aa 7491 1727203968.85538: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003aa 7491 1727203968.85541: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.027355", "end": "2024-09-24 14:52:48.814887", "rc": 0, "start": "2024-09-24 14:52:48.787532" } 7491 1727203968.85638: no more pending results, returning what we have 7491 1727203968.85641: results queue empty 7491 1727203968.85642: checking for any_errors_fatal 7491 1727203968.85653: done checking for any_errors_fatal 7491 1727203968.85653: checking for max_fail_percentage 7491 1727203968.85655: done checking for max_fail_percentage 7491 1727203968.85656: checking to see if all hosts have failed and the running result is not ok 7491 1727203968.85657: done checking to see if all hosts have failed 7491 1727203968.85657: getting 
the remaining hosts for this loop 7491 1727203968.85659: done getting the remaining hosts for this loop 7491 1727203968.85662: getting the next task for host managed-node3 7491 1727203968.85669: done getting next task for host managed-node3 7491 1727203968.85672: ^ task is: TASK: Delete veth interface {{ interface }} 7491 1727203968.85674: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203968.85678: getting variables 7491 1727203968.85679: in VariableManager get_vars() 7491 1727203968.85724: Calling all_inventory to load vars for managed-node3 7491 1727203968.85727: Calling groups_inventory to load vars for managed-node3 7491 1727203968.85729: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203968.85738: Calling all_plugins_play to load vars for managed-node3 7491 1727203968.85740: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203968.85742: Calling groups_plugins_play to load vars for managed-node3 7491 1727203968.85853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203968.85976: done with get_vars() 7491 1727203968.85985: done getting variables 7491 1727203968.86028: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203968.86117: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:52:48 -0400 (0:00:00.341) 0:00:10.785 ***** 7491 1727203968.86140: entering _queue_task() for managed-node3/command 7491 1727203968.86326: worker is 1 (out of 1 available) 7491 1727203968.86340: exiting _queue_task() for managed-node3/command 7491 1727203968.86354: done queuing things up, now waiting for results queue to drain 7491 1727203968.86355: waiting for pending results... 7491 1727203968.86515: running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 7491 1727203968.86579: in run() - task 0affcd87-79f5-0a4a-ad01-0000000003ab 7491 1727203968.86592: variable 'ansible_search_path' from source: unknown 7491 1727203968.86596: variable 'ansible_search_path' from source: unknown 7491 1727203968.86629: calling self._execute() 7491 1727203968.86693: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.86699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.86715: variable 'omit' from source: magic vars 7491 1727203968.87034: variable 'ansible_distribution_major_version' from source: facts 7491 1727203968.87049: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203968.87186: variable 'type' from source: play vars 7491 1727203968.87189: variable 'state' from source: include params 7491 1727203968.87194: variable 'interface' from source: play vars 7491 1727203968.87198: variable 'current_interfaces' from source: set_fact 7491 1727203968.87205: Evaluated conditional 
(type == 'veth' and state == 'absent' and interface in current_interfaces): False 7491 1727203968.87208: when evaluation is False, skipping this task 7491 1727203968.87210: _execute() done 7491 1727203968.87213: dumping result to json 7491 1727203968.87215: done dumping result, returning 7491 1727203968.87224: done running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 [0affcd87-79f5-0a4a-ad01-0000000003ab] 7491 1727203968.87230: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003ab 7491 1727203968.87309: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003ab 7491 1727203968.87311: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7491 1727203968.87358: no more pending results, returning what we have 7491 1727203968.87362: results queue empty 7491 1727203968.87363: checking for any_errors_fatal 7491 1727203968.87373: done checking for any_errors_fatal 7491 1727203968.87374: checking for max_fail_percentage 7491 1727203968.87376: done checking for max_fail_percentage 7491 1727203968.87377: checking to see if all hosts have failed and the running result is not ok 7491 1727203968.87378: done checking to see if all hosts have failed 7491 1727203968.87378: getting the remaining hosts for this loop 7491 1727203968.87380: done getting the remaining hosts for this loop 7491 1727203968.87383: getting the next task for host managed-node3 7491 1727203968.87388: done getting next task for host managed-node3 7491 1727203968.87391: ^ task is: TASK: Create dummy interface {{ interface }} 7491 1727203968.87394: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203968.87397: getting variables 7491 1727203968.87398: in VariableManager get_vars() 7491 1727203968.87439: Calling all_inventory to load vars for managed-node3 7491 1727203968.87442: Calling groups_inventory to load vars for managed-node3 7491 1727203968.87443: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203968.87452: Calling all_plugins_play to load vars for managed-node3 7491 1727203968.87454: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203968.87456: Calling groups_plugins_play to load vars for managed-node3 7491 1727203968.87611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203968.87727: done with get_vars() 7491 1727203968.87734: done getting variables 7491 1727203968.87776: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203968.87854: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:52:48 -0400 (0:00:00.017) 0:00:10.802 ***** 7491 1727203968.87877: entering _queue_task() for managed-node3/command 7491 1727203968.88050: worker is 1 (out of 1 
available) 7491 1727203968.88066: exiting _queue_task() for managed-node3/command 7491 1727203968.88080: done queuing things up, now waiting for results queue to drain 7491 1727203968.88082: waiting for pending results... 7491 1727203968.88230: running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 7491 1727203968.88297: in run() - task 0affcd87-79f5-0a4a-ad01-0000000003ac 7491 1727203968.88308: variable 'ansible_search_path' from source: unknown 7491 1727203968.88311: variable 'ansible_search_path' from source: unknown 7491 1727203968.88343: calling self._execute() 7491 1727203968.88406: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.88411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.88423: variable 'omit' from source: magic vars 7491 1727203968.88674: variable 'ansible_distribution_major_version' from source: facts 7491 1727203968.88686: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203968.88815: variable 'type' from source: play vars 7491 1727203968.88823: variable 'state' from source: include params 7491 1727203968.88826: variable 'interface' from source: play vars 7491 1727203968.88830: variable 'current_interfaces' from source: set_fact 7491 1727203968.88837: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7491 1727203968.88839: when evaluation is False, skipping this task 7491 1727203968.88842: _execute() done 7491 1727203968.88844: dumping result to json 7491 1727203968.88849: done dumping result, returning 7491 1727203968.88854: done running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 [0affcd87-79f5-0a4a-ad01-0000000003ac] 7491 1727203968.88861: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003ac 7491 1727203968.88938: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003ac 7491 1727203968.88941: WORKER 
PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7491 1727203968.89001: no more pending results, returning what we have 7491 1727203968.89004: results queue empty 7491 1727203968.89005: checking for any_errors_fatal 7491 1727203968.89009: done checking for any_errors_fatal 7491 1727203968.89009: checking for max_fail_percentage 7491 1727203968.89011: done checking for max_fail_percentage 7491 1727203968.89011: checking to see if all hosts have failed and the running result is not ok 7491 1727203968.89012: done checking to see if all hosts have failed 7491 1727203968.89013: getting the remaining hosts for this loop 7491 1727203968.89014: done getting the remaining hosts for this loop 7491 1727203968.89017: getting the next task for host managed-node3 7491 1727203968.89022: done getting next task for host managed-node3 7491 1727203968.89024: ^ task is: TASK: Delete dummy interface {{ interface }} 7491 1727203968.89027: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203968.89030: getting variables 7491 1727203968.89031: in VariableManager get_vars() 7491 1727203968.89071: Calling all_inventory to load vars for managed-node3 7491 1727203968.89074: Calling groups_inventory to load vars for managed-node3 7491 1727203968.89075: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203968.89082: Calling all_plugins_play to load vars for managed-node3 7491 1727203968.89083: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203968.89085: Calling groups_plugins_play to load vars for managed-node3 7491 1727203968.89196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203968.89310: done with get_vars() 7491 1727203968.89318: done getting variables 7491 1727203968.89361: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203968.89443: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:52:48 -0400 (0:00:00.015) 0:00:10.818 ***** 7491 1727203968.89466: entering _queue_task() for managed-node3/command 7491 1727203968.89634: worker is 1 (out of 1 available) 7491 1727203968.89648: exiting _queue_task() for managed-node3/command 7491 1727203968.89662: done queuing things up, now waiting for results queue to drain 7491 1727203968.89663: waiting for pending results... 
7491 1727203968.89813: running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 7491 1727203968.89882: in run() - task 0affcd87-79f5-0a4a-ad01-0000000003ad 7491 1727203968.89893: variable 'ansible_search_path' from source: unknown 7491 1727203968.89896: variable 'ansible_search_path' from source: unknown 7491 1727203968.89929: calling self._execute() 7491 1727203968.89994: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.89998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.90006: variable 'omit' from source: magic vars 7491 1727203968.90254: variable 'ansible_distribution_major_version' from source: facts 7491 1727203968.90266: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203968.90395: variable 'type' from source: play vars 7491 1727203968.90399: variable 'state' from source: include params 7491 1727203968.90402: variable 'interface' from source: play vars 7491 1727203968.90407: variable 'current_interfaces' from source: set_fact 7491 1727203968.90413: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7491 1727203968.90415: when evaluation is False, skipping this task 7491 1727203968.90418: _execute() done 7491 1727203968.90424: dumping result to json 7491 1727203968.90426: done dumping result, returning 7491 1727203968.90434: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 [0affcd87-79f5-0a4a-ad01-0000000003ad] 7491 1727203968.90441: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003ad 7491 1727203968.90517: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003ad 7491 1727203968.90520: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7491 
1727203968.90583: no more pending results, returning what we have 7491 1727203968.90586: results queue empty 7491 1727203968.90587: checking for any_errors_fatal 7491 1727203968.90591: done checking for any_errors_fatal 7491 1727203968.90592: checking for max_fail_percentage 7491 1727203968.90593: done checking for max_fail_percentage 7491 1727203968.90594: checking to see if all hosts have failed and the running result is not ok 7491 1727203968.90595: done checking to see if all hosts have failed 7491 1727203968.90595: getting the remaining hosts for this loop 7491 1727203968.90596: done getting the remaining hosts for this loop 7491 1727203968.90599: getting the next task for host managed-node3 7491 1727203968.90604: done getting next task for host managed-node3 7491 1727203968.90606: ^ task is: TASK: Create tap interface {{ interface }} 7491 1727203968.90608: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203968.90611: getting variables 7491 1727203968.90613: in VariableManager get_vars() 7491 1727203968.90649: Calling all_inventory to load vars for managed-node3 7491 1727203968.90651: Calling groups_inventory to load vars for managed-node3 7491 1727203968.90653: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203968.90660: Calling all_plugins_play to load vars for managed-node3 7491 1727203968.90661: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203968.90663: Calling groups_plugins_play to load vars for managed-node3 7491 1727203968.90806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203968.90923: done with get_vars() 7491 1727203968.90930: done getting variables 7491 1727203968.90975: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203968.91046: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:52:48 -0400 (0:00:00.016) 0:00:10.834 ***** 7491 1727203968.91070: entering _queue_task() for managed-node3/command 7491 1727203968.91236: worker is 1 (out of 1 available) 7491 1727203968.91249: exiting _queue_task() for managed-node3/command 7491 1727203968.91262: done queuing things up, now waiting for results queue to drain 7491 1727203968.91265: waiting for pending results... 
7491 1727203968.91415: running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 7491 1727203968.91479: in run() - task 0affcd87-79f5-0a4a-ad01-0000000003ae 7491 1727203968.91491: variable 'ansible_search_path' from source: unknown 7491 1727203968.91494: variable 'ansible_search_path' from source: unknown 7491 1727203968.91528: calling self._execute() 7491 1727203968.91589: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.91595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.91604: variable 'omit' from source: magic vars 7491 1727203968.91849: variable 'ansible_distribution_major_version' from source: facts 7491 1727203968.91858: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203968.91995: variable 'type' from source: play vars 7491 1727203968.91998: variable 'state' from source: include params 7491 1727203968.92003: variable 'interface' from source: play vars 7491 1727203968.92006: variable 'current_interfaces' from source: set_fact 7491 1727203968.92013: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7491 1727203968.92016: when evaluation is False, skipping this task 7491 1727203968.92019: _execute() done 7491 1727203968.92024: dumping result to json 7491 1727203968.92027: done dumping result, returning 7491 1727203968.92033: done running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 [0affcd87-79f5-0a4a-ad01-0000000003ae] 7491 1727203968.92040: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003ae 7491 1727203968.92117: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003ae 7491 1727203968.92120: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7491 
1727203968.92189: no more pending results, returning what we have 7491 1727203968.92192: results queue empty 7491 1727203968.92193: checking for any_errors_fatal 7491 1727203968.92198: done checking for any_errors_fatal 7491 1727203968.92199: checking for max_fail_percentage 7491 1727203968.92200: done checking for max_fail_percentage 7491 1727203968.92201: checking to see if all hosts have failed and the running result is not ok 7491 1727203968.92202: done checking to see if all hosts have failed 7491 1727203968.92203: getting the remaining hosts for this loop 7491 1727203968.92204: done getting the remaining hosts for this loop 7491 1727203968.92207: getting the next task for host managed-node3 7491 1727203968.92211: done getting next task for host managed-node3 7491 1727203968.92213: ^ task is: TASK: Delete tap interface {{ interface }} 7491 1727203968.92215: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203968.92219: getting variables 7491 1727203968.92220: in VariableManager get_vars() 7491 1727203968.92251: Calling all_inventory to load vars for managed-node3 7491 1727203968.92252: Calling groups_inventory to load vars for managed-node3 7491 1727203968.92254: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203968.92266: Calling all_plugins_play to load vars for managed-node3 7491 1727203968.92269: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203968.92271: Calling groups_plugins_play to load vars for managed-node3 7491 1727203968.92382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203968.92498: done with get_vars() 7491 1727203968.92506: done getting variables 7491 1727203968.92547: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203968.92626: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:52:48 -0400 (0:00:00.015) 0:00:10.850 ***** 7491 1727203968.92646: entering _queue_task() for managed-node3/command 7491 1727203968.92813: worker is 1 (out of 1 available) 7491 1727203968.92827: exiting _queue_task() for managed-node3/command 7491 1727203968.92840: done queuing things up, now waiting for results queue to drain 7491 1727203968.92841: waiting for pending results... 
7491 1727203968.92996: running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 7491 1727203968.93059: in run() - task 0affcd87-79f5-0a4a-ad01-0000000003af 7491 1727203968.93073: variable 'ansible_search_path' from source: unknown 7491 1727203968.93077: variable 'ansible_search_path' from source: unknown 7491 1727203968.93103: calling self._execute() 7491 1727203968.93168: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.93172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.93182: variable 'omit' from source: magic vars 7491 1727203968.93429: variable 'ansible_distribution_major_version' from source: facts 7491 1727203968.93439: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203968.93828: variable 'type' from source: play vars 7491 1727203968.93831: variable 'state' from source: include params 7491 1727203968.93834: variable 'interface' from source: play vars 7491 1727203968.93838: variable 'current_interfaces' from source: set_fact 7491 1727203968.93845: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7491 1727203968.93847: when evaluation is False, skipping this task 7491 1727203968.93850: _execute() done 7491 1727203968.93852: dumping result to json 7491 1727203968.93856: done dumping result, returning 7491 1727203968.93862: done running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 [0affcd87-79f5-0a4a-ad01-0000000003af] 7491 1727203968.93869: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003af 7491 1727203968.93948: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000003af 7491 1727203968.93951: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7491 1727203968.93998: no 
more pending results, returning what we have 7491 1727203968.94002: results queue empty 7491 1727203968.94003: checking for any_errors_fatal 7491 1727203968.94007: done checking for any_errors_fatal 7491 1727203968.94007: checking for max_fail_percentage 7491 1727203968.94009: done checking for max_fail_percentage 7491 1727203968.94009: checking to see if all hosts have failed and the running result is not ok 7491 1727203968.94010: done checking to see if all hosts have failed 7491 1727203968.94011: getting the remaining hosts for this loop 7491 1727203968.94013: done getting the remaining hosts for this loop 7491 1727203968.94016: getting the next task for host managed-node3 7491 1727203968.94022: done getting next task for host managed-node3 7491 1727203968.94025: ^ task is: TASK: Include the task 'assert_device_present.yml' 7491 1727203968.94027: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203968.94030: getting variables 7491 1727203968.94031: in VariableManager get_vars() 7491 1727203968.94077: Calling all_inventory to load vars for managed-node3 7491 1727203968.94079: Calling groups_inventory to load vars for managed-node3 7491 1727203968.94081: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203968.94088: Calling all_plugins_play to load vars for managed-node3 7491 1727203968.94089: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203968.94091: Calling groups_plugins_play to load vars for managed-node3 7491 1727203968.94370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203968.94482: done with get_vars() 7491 1727203968.94489: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:15 Tuesday 24 September 2024 14:52:48 -0400 (0:00:00.019) 0:00:10.869 ***** 7491 1727203968.94551: entering _queue_task() for managed-node3/include_tasks 7491 1727203968.94716: worker is 1 (out of 1 available) 7491 1727203968.94727: exiting _queue_task() for managed-node3/include_tasks 7491 1727203968.94740: done queuing things up, now waiting for results queue to drain 7491 1727203968.94741: waiting for pending results... 
7491 1727203968.94900: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' 7491 1727203968.94963: in run() - task 0affcd87-79f5-0a4a-ad01-00000000000d 7491 1727203968.94976: variable 'ansible_search_path' from source: unknown 7491 1727203968.95004: calling self._execute() 7491 1727203968.95070: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.95073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.95083: variable 'omit' from source: magic vars 7491 1727203968.95341: variable 'ansible_distribution_major_version' from source: facts 7491 1727203968.95350: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203968.95356: _execute() done 7491 1727203968.95359: dumping result to json 7491 1727203968.95364: done dumping result, returning 7491 1727203968.95370: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' [0affcd87-79f5-0a4a-ad01-00000000000d] 7491 1727203968.95379: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000000d 7491 1727203968.95457: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000000d 7491 1727203968.95460: WORKER PROCESS EXITING 7491 1727203968.95496: no more pending results, returning what we have 7491 1727203968.95500: in VariableManager get_vars() 7491 1727203968.95546: Calling all_inventory to load vars for managed-node3 7491 1727203968.95549: Calling groups_inventory to load vars for managed-node3 7491 1727203968.95551: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203968.95559: Calling all_plugins_play to load vars for managed-node3 7491 1727203968.95561: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203968.95563: Calling groups_plugins_play to load vars for managed-node3 7491 1727203968.95686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7491 1727203968.95799: done with get_vars() 7491 1727203968.95805: variable 'ansible_search_path' from source: unknown 7491 1727203968.95815: we have included files to process 7491 1727203968.95816: generating all_blocks data 7491 1727203968.95818: done generating all_blocks data 7491 1727203968.95822: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7491 1727203968.95822: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7491 1727203968.95824: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7491 1727203968.95932: in VariableManager get_vars() 7491 1727203968.95949: done with get_vars() 7491 1727203968.96023: done processing included file 7491 1727203968.96024: iterating over new_blocks loaded from include file 7491 1727203968.96025: in VariableManager get_vars() 7491 1727203968.96041: done with get_vars() 7491 1727203968.96042: filtering new block on tags 7491 1727203968.96053: done filtering new block on tags 7491 1727203968.96054: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 7491 1727203968.96058: extending task lists for all hosts with included blocks 7491 1727203968.98478: done extending task lists 7491 1727203968.98480: done processing included files 7491 1727203968.98480: results queue empty 7491 1727203968.98481: checking for any_errors_fatal 7491 1727203968.98483: done checking for any_errors_fatal 7491 1727203968.98483: checking for max_fail_percentage 7491 1727203968.98484: done checking for max_fail_percentage 7491 1727203968.98484: checking to see if all hosts have failed and the running 
result is not ok 7491 1727203968.98485: done checking to see if all hosts have failed 7491 1727203968.98485: getting the remaining hosts for this loop 7491 1727203968.98486: done getting the remaining hosts for this loop 7491 1727203968.98488: getting the next task for host managed-node3 7491 1727203968.98490: done getting next task for host managed-node3 7491 1727203968.98492: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7491 1727203968.98494: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203968.98495: getting variables 7491 1727203968.98496: in VariableManager get_vars() 7491 1727203968.98514: Calling all_inventory to load vars for managed-node3 7491 1727203968.98516: Calling groups_inventory to load vars for managed-node3 7491 1727203968.98518: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203968.98524: Calling all_plugins_play to load vars for managed-node3 7491 1727203968.98526: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203968.98528: Calling groups_plugins_play to load vars for managed-node3 7491 1727203968.98615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203968.98737: done with get_vars() 7491 1727203968.98745: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:52:48 -0400 (0:00:00.042) 0:00:10.911 ***** 7491 1727203968.98799: entering _queue_task() for managed-node3/include_tasks 7491 1727203968.99006: worker is 1 (out of 1 available) 7491 1727203968.99018: exiting _queue_task() for managed-node3/include_tasks 7491 1727203968.99031: done queuing things up, now waiting for results queue to drain 7491 1727203968.99032: waiting for pending results... 
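The include-processing messages above ("we have included files to process", "generating all_blocks data", "filtering new block on tags", "extending task lists for all hosts with included blocks") trace how a dynamic `include_tasks` splices new tasks into the running list. A much-simplified model of those steps, under the assumption that the included file yields a flat list of task blocks (this is illustrative only, not Ansible's actual strategy-plugin code):

```python
# Simplified model of the include-expansion steps traced in the log:
# load the included file's blocks, filter them on tags, then splice them
# into the host's task list right after the include task itself.

def filter_on_tags(blocks, run_tags):
    """'filtering new block on tags': keep untagged blocks, or blocks
    whose tags intersect the requested run tags."""
    if not run_tags:
        return list(blocks)
    return [b for b in blocks if not b.get("tags") or set(b["tags"]) & run_tags]

def expand_include(task_list, include_index, included_blocks, run_tags=None):
    """'extending task lists ... with included blocks': insert the filtered
    blocks immediately after the include task."""
    blocks = filter_on_tags(included_blocks, run_tags)
    return task_list[: include_index + 1] + blocks + task_list[include_index + 1 :]

tasks = ["include assert_device_present.yml", "next play task"]
included = [
    {"name": "Include the task 'get_interface_stat.yml'"},
    {"name": "Assert that the interface is present"},
]
print(len(expand_include(tasks, 0, included)))  # 4
```

The nested "tasks child state?" dumps in the log are the bookkeeping this splicing requires: each included block gets its own child host state so iteration can resume in the parent list when the block finishes.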
7491 1727203968.99201: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 7491 1727203968.99267: in run() - task 0affcd87-79f5-0a4a-ad01-0000000005f5 7491 1727203968.99286: variable 'ansible_search_path' from source: unknown 7491 1727203968.99290: variable 'ansible_search_path' from source: unknown 7491 1727203968.99317: calling self._execute() 7491 1727203968.99389: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203968.99393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203968.99402: variable 'omit' from source: magic vars 7491 1727203968.99678: variable 'ansible_distribution_major_version' from source: facts 7491 1727203968.99688: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203968.99695: _execute() done 7491 1727203968.99698: dumping result to json 7491 1727203968.99701: done dumping result, returning 7491 1727203968.99709: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-0a4a-ad01-0000000005f5] 7491 1727203968.99713: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000005f5 7491 1727203968.99797: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000005f5 7491 1727203968.99799: WORKER PROCESS EXITING 7491 1727203968.99851: no more pending results, returning what we have 7491 1727203968.99856: in VariableManager get_vars() 7491 1727203968.99909: Calling all_inventory to load vars for managed-node3 7491 1727203968.99912: Calling groups_inventory to load vars for managed-node3 7491 1727203968.99914: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203968.99928: Calling all_plugins_play to load vars for managed-node3 7491 1727203968.99935: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203968.99938: Calling groups_plugins_play to load vars for managed-node3 7491 1727203969.00056: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203969.00179: done with get_vars() 7491 1727203969.00185: variable 'ansible_search_path' from source: unknown 7491 1727203969.00186: variable 'ansible_search_path' from source: unknown 7491 1727203969.00209: we have included files to process 7491 1727203969.00210: generating all_blocks data 7491 1727203969.00211: done generating all_blocks data 7491 1727203969.00212: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7491 1727203969.00213: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7491 1727203969.00214: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7491 1727203969.00372: done processing included file 7491 1727203969.00374: iterating over new_blocks loaded from include file 7491 1727203969.00375: in VariableManager get_vars() 7491 1727203969.00390: done with get_vars() 7491 1727203969.00391: filtering new block on tags 7491 1727203969.00402: done filtering new block on tags 7491 1727203969.00403: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 7491 1727203969.00407: extending task lists for all hosts with included blocks 7491 1727203969.00465: done extending task lists 7491 1727203969.00467: done processing included files 7491 1727203969.00467: results queue empty 7491 1727203969.00468: checking for any_errors_fatal 7491 1727203969.00470: done checking for any_errors_fatal 7491 1727203969.00471: checking for max_fail_percentage 7491 1727203969.00471: done checking for max_fail_percentage 7491 1727203969.00472: 
checking to see if all hosts have failed and the running result is not ok 7491 1727203969.00473: done checking to see if all hosts have failed 7491 1727203969.00473: getting the remaining hosts for this loop 7491 1727203969.00474: done getting the remaining hosts for this loop 7491 1727203969.00476: getting the next task for host managed-node3 7491 1727203969.00479: done getting next task for host managed-node3 7491 1727203969.00480: ^ task is: TASK: Get stat for interface {{ interface }} 7491 1727203969.00483: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203969.00484: getting variables 7491 1727203969.00485: in VariableManager get_vars() 7491 1727203969.00496: Calling all_inventory to load vars for managed-node3 7491 1727203969.00498: Calling groups_inventory to load vars for managed-node3 7491 1727203969.00499: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203969.00503: Calling all_plugins_play to load vars for managed-node3 7491 1727203969.00507: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203969.00510: Calling groups_plugins_play to load vars for managed-node3 7491 1727203969.00622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203969.00731: done with get_vars() 7491 1727203969.00738: done getting variables 7491 1727203969.00850: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:52:49 -0400 (0:00:00.020) 0:00:10.932 ***** 7491 1727203969.00873: entering _queue_task() for managed-node3/stat 7491 1727203969.01061: worker is 1 (out of 1 available) 7491 1727203969.01076: exiting _queue_task() for managed-node3/stat 7491 1727203969.01089: done queuing things up, now waiting for results queue to drain 7491 1727203969.01090: waiting for pending results... 
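The "Get stat for interface veth0" task queued above runs the `stat` module against `/sys/class/net/veth0` on the managed node (the module args appear later in the trace). As a hedged sketch of that presence check only — `SYSFS_NET` and `device_present` are illustrative names, not part of the role — the logic reduces to a sysfs existence test:

```shell
# A network device is considered present when its entry under
# /sys/class/net exists. SYSFS_NET is parameterized here so the idiom
# can be exercised against a scratch directory instead of real sysfs.
SYSFS_NET="${SYSFS_NET:-/sys/class/net}"

device_present() {
    # -e follows symlinks; in the trace the sysfs entry is a symlink
    # (islnk=true) pointing into /sys/devices/virtual/net.
    [ -e "$SYSFS_NET/$1" ]
}
```

This is only the check itself; the real task additionally records the full stat result (mode, link target, timestamps) for the assert that follows.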
7491 1727203969.01252: running TaskExecutor() for managed-node3/TASK: Get stat for interface veth0 7491 1727203969.01325: in run() - task 0affcd87-79f5-0a4a-ad01-0000000007ee 7491 1727203969.01335: variable 'ansible_search_path' from source: unknown 7491 1727203969.01340: variable 'ansible_search_path' from source: unknown 7491 1727203969.01371: calling self._execute() 7491 1727203969.01435: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203969.01439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203969.01447: variable 'omit' from source: magic vars 7491 1727203969.01711: variable 'ansible_distribution_major_version' from source: facts 7491 1727203969.01723: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203969.01729: variable 'omit' from source: magic vars 7491 1727203969.01766: variable 'omit' from source: magic vars 7491 1727203969.01836: variable 'interface' from source: play vars 7491 1727203969.01849: variable 'omit' from source: magic vars 7491 1727203969.01885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203969.01913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203969.01931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203969.01944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203969.01953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203969.01978: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203969.01982: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203969.01984: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed-node3' 7491 1727203969.02055: Set connection var ansible_timeout to 10 7491 1727203969.02060: Set connection var ansible_pipelining to False 7491 1727203969.02067: Set connection var ansible_shell_type to sh 7491 1727203969.02074: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203969.02081: Set connection var ansible_shell_executable to /bin/sh 7491 1727203969.02085: Set connection var ansible_connection to ssh 7491 1727203969.02102: variable 'ansible_shell_executable' from source: unknown 7491 1727203969.02106: variable 'ansible_connection' from source: unknown 7491 1727203969.02110: variable 'ansible_module_compression' from source: unknown 7491 1727203969.02112: variable 'ansible_shell_type' from source: unknown 7491 1727203969.02115: variable 'ansible_shell_executable' from source: unknown 7491 1727203969.02117: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203969.02119: variable 'ansible_pipelining' from source: unknown 7491 1727203969.02121: variable 'ansible_timeout' from source: unknown 7491 1727203969.02126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203969.02271: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203969.02280: variable 'omit' from source: magic vars 7491 1727203969.02286: starting attempt loop 7491 1727203969.02288: running the handler 7491 1727203969.02299: _low_level_execute_command(): starting 7491 1727203969.02306: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203969.02840: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 
1727203969.02855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.02870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.02884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203969.02902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.02940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203969.02955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203969.03008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203969.04618: stdout chunk (state=3): >>>/root <<< 7491 1727203969.04708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203969.04770: stderr chunk (state=3): >>><<< 7491 1727203969.04777: stdout chunk (state=3): >>><<< 7491 1727203969.04799: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203969.04810: _low_level_execute_command(): starting 7491 1727203969.04815: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203969.0479884-8280-25203375970780 `" && echo ansible-tmp-1727203969.0479884-8280-25203375970780="` echo /root/.ansible/tmp/ansible-tmp-1727203969.0479884-8280-25203375970780 `" ) && sleep 0' 7491 1727203969.05275: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203969.05288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.05309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203969.05326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.05382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203969.05397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203969.05439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203969.07258: stdout chunk (state=3): >>>ansible-tmp-1727203969.0479884-8280-25203375970780=/root/.ansible/tmp/ansible-tmp-1727203969.0479884-8280-25203375970780 <<< 7491 1727203969.07368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203969.07424: stderr chunk (state=3): >>><<< 7491 1727203969.07428: stdout chunk (state=3): >>><<< 7491 1727203969.07445: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203969.0479884-8280-25203375970780=/root/.ansible/tmp/ansible-tmp-1727203969.0479884-8280-25203375970780 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203969.07486: variable 'ansible_module_compression' from source: unknown 7491 1727203969.07537: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7491 1727203969.07566: variable 'ansible_facts' from source: unknown 7491 1727203969.07632: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203969.0479884-8280-25203375970780/AnsiballZ_stat.py 7491 1727203969.07754: Sending initial data 7491 1727203969.07763: Sent initial data (150 bytes) 7491 1727203969.08442: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203969.08445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.08469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203969.08481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.08538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203969.08541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203969.08589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203969.10263: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203969.10296: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203969.10345: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpooqof_o9 /root/.ansible/tmp/ansible-tmp-1727203969.0479884-8280-25203375970780/AnsiballZ_stat.py <<< 7491 1727203969.10388: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203969.11182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203969.11288: stderr chunk (state=3): >>><<< 7491 1727203969.11291: stdout chunk (state=3): >>><<< 7491 1727203969.11310: done transferring module to remote 7491 1727203969.11321: _low_level_execute_command(): starting 7491 1727203969.11324: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727203969.0479884-8280-25203375970780/ /root/.ansible/tmp/ansible-tmp-1727203969.0479884-8280-25203375970780/AnsiballZ_stat.py && sleep 0' 7491 1727203969.11787: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203969.11802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.11817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203969.11829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203969.11844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.11892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203969.11904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203969.11956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203969.13639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203969.13697: stderr chunk (state=3): >>><<< 7491 1727203969.13700: stdout chunk (state=3): >>><<< 7491 1727203969.13714: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203969.13725: _low_level_execute_command(): starting 7491 1727203969.13729: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203969.0479884-8280-25203375970780/AnsiballZ_stat.py && sleep 0' 7491 1727203969.14177: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203969.14189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.14209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203969.14225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.14271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203969.14287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203969.14338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203969.27471: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23327, "dev": 21, "nlink": 1, "atime": 1727203967.8445623, "mtime": 1727203967.8445623, "ctime": 1727203967.8445623, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7491 1727203969.28428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203969.28495: stderr chunk (state=3): >>><<< 7491 1727203969.28499: stdout chunk (state=3): >>><<< 7491 1727203969.28525: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23327, "dev": 21, "nlink": 1, "atime": 1727203967.8445623, "mtime": 1727203967.8445623, "ctime": 1727203967.8445623, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203969.28567: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203969.0479884-8280-25203375970780/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203969.28576: _low_level_execute_command(): starting 7491 1727203969.28585: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203969.0479884-8280-25203375970780/ > /dev/null 2>&1 && sleep 0' 7491 1727203969.29068: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203969.29083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.29094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.29106: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.87 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
<<< 7491 1727203969.29115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<< 7491 1727203969.29170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master
<<< 7491 1727203969.29183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
<<< 7491 1727203969.29228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<< 7491 1727203969.30987: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<< 7491 1727203969.31045: stderr chunk (state=3): >>><<<
7491 1727203969.31050: stdout chunk (state=3): >>><<<
7491 1727203969.31065: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.87 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7491 1727203969.31073: handler run complete
7491 1727203969.31108: attempt loop complete, returning result
7491 1727203969.31111: _execute() done
7491 1727203969.31114: dumping result to json
7491 1727203969.31121: done dumping result, returning
7491 1727203969.31128: done running TaskExecutor() for managed-node3/TASK: Get stat for interface veth0 [0affcd87-79f5-0a4a-ad01-0000000007ee]
7491 1727203969.31133: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000007ee
7491 1727203969.31239: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000007ee
7491 1727203969.31242: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "stat": {
        "atime": 1727203967.8445623,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1727203967.8445623,
        "dev": 21,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 23327,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/veth0",
        "lnk_target": "../../devices/virtual/net/veth0",
        "mode": "0777",
        "mtime": 1727203967.8445623,
        "nlink": 1,
        "path": "/sys/class/net/veth0",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
7491 1727203969.31357: no more pending results, returning what we have
7491 1727203969.31360: results queue empty
7491 1727203969.31361: checking for any_errors_fatal
7491 1727203969.31363: done checking for any_errors_fatal
7491 1727203969.31367: checking for max_fail_percentage
7491 1727203969.31369: done checking for max_fail_percentage
7491 1727203969.31370: checking to see if all hosts have failed and the running result is not ok
7491 1727203969.31371: done checking to see if all hosts have failed
7491 1727203969.31372: getting the remaining hosts for this loop
7491 1727203969.31373: done getting the remaining hosts for this loop
7491 1727203969.31377: getting the next task for host managed-node3
7491 1727203969.31383: done getting next task for host managed-node3
7491 1727203969.31386: ^ task is: TASK: Assert that the interface is present - '{{ interface }}'
7491 1727203969.31389: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203969.31392: getting variables
7491 1727203969.31394: in VariableManager get_vars()
7491 1727203969.31441: Calling all_inventory to load vars for managed-node3
7491 1727203969.31443: Calling groups_inventory to load vars for managed-node3
7491 1727203969.31445: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203969.31453: Calling all_plugins_play to load vars for managed-node3
7491 1727203969.31454: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203969.31456: Calling groups_plugins_play to load vars for managed-node3
7491 1727203969.31577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203969.31697: done with get_vars()
7491 1727203969.31706: done getting variables
7491 1727203969.31785: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
7491 1727203969.31878: variable 'interface' from source: play vars

TASK [Assert that the interface is present - 'veth0'] **************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Tuesday 24 September 2024 14:52:49 -0400 (0:00:00.310) 0:00:11.242 *****
7491 1727203969.31901: entering _queue_task() for managed-node3/assert
7491 1727203969.31902: Creating lock for assert
7491 1727203969.32108: worker is 1 (out of 1 available)
7491 1727203969.32125: exiting _queue_task() for managed-node3/assert
7491 1727203969.32138: done queuing things up, now waiting for results queue to drain
7491 1727203969.32139: waiting for pending results...
7491 1727203969.32298: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'veth0'
7491 1727203969.32364: in run() - task 0affcd87-79f5-0a4a-ad01-0000000005f6
7491 1727203969.32375: variable 'ansible_search_path' from source: unknown
7491 1727203969.32378: variable 'ansible_search_path' from source: unknown
7491 1727203969.32411: calling self._execute()
7491 1727203969.32478: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203969.32481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203969.32490: variable 'omit' from source: magic vars
7491 1727203969.32813: variable 'ansible_distribution_major_version' from source: facts
7491 1727203969.32827: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203969.32830: variable 'omit' from source: magic vars
7491 1727203969.32859: variable 'omit' from source: magic vars
7491 1727203969.32928: variable 'interface' from source: play vars
7491 1727203969.32941: variable 'omit' from source: magic vars
7491 1727203969.32979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7491 1727203969.33004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7491 1727203969.33023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7491 1727203969.33036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203969.33046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203969.33071: variable 'inventory_hostname' from source: host vars for 'managed-node3'
7491 1727203969.33075: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203969.33078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203969.33143: Set connection var ansible_timeout to 10
7491 1727203969.33152: Set connection var ansible_pipelining to False
7491 1727203969.33154: Set connection var ansible_shell_type to sh
7491 1727203969.33160: Set connection var ansible_module_compression to ZIP_DEFLATED
7491 1727203969.33168: Set connection var ansible_shell_executable to /bin/sh
7491 1727203969.33173: Set connection var ansible_connection to ssh
7491 1727203969.33192: variable 'ansible_shell_executable' from source: unknown
7491 1727203969.33196: variable 'ansible_connection' from source: unknown
7491 1727203969.33199: variable 'ansible_module_compression' from source: unknown
7491 1727203969.33201: variable 'ansible_shell_type' from source: unknown
7491 1727203969.33203: variable 'ansible_shell_executable' from source: unknown
7491 1727203969.33205: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203969.33207: variable 'ansible_pipelining' from source: unknown
7491 1727203969.33211: variable 'ansible_timeout' from source: unknown
7491 1727203969.33215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203969.33320: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7491 1727203969.33326: variable 'omit' from source: magic vars
7491 1727203969.33331: starting attempt loop
7491 1727203969.33334: running the handler
7491 1727203969.33428: variable 'interface_stat' from source: set_fact
7491 1727203969.33442: Evaluated conditional (interface_stat.stat.exists): True
7491 1727203969.33447: handler run complete
7491 1727203969.33458: attempt loop complete, returning result
7491 1727203969.33461: _execute() done
7491 1727203969.33465: dumping result to json
7491 1727203969.33468: done dumping result, returning
7491 1727203969.33476: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'veth0' [0affcd87-79f5-0a4a-ad01-0000000005f6]
7491 1727203969.33481: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000005f6
7491 1727203969.33563: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000005f6
7491 1727203969.33568: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false
}

MSG:

All assertions passed
7491 1727203969.33645: no more pending results, returning what we have
7491 1727203969.33648: results queue empty
7491 1727203969.33649: checking for any_errors_fatal
7491 1727203969.33656: done checking for any_errors_fatal
7491 1727203969.33656: checking for max_fail_percentage
7491 1727203969.33658: done checking for max_fail_percentage
7491 1727203969.33658: checking to see if all hosts have failed and the running result is not ok
7491 1727203969.33659: done checking to see if all hosts have failed
7491 1727203969.33660: getting the remaining hosts for this loop
7491 1727203969.33662: done getting the remaining hosts for this loop
7491 1727203969.33667: getting the next task for host managed-node3
7491 1727203969.33674: done getting next task for host managed-node3
7491 1727203969.33676: ^ task is: TASK: TEST: I can configure an interface with auto_gateway enabled
7491 1727203969.33682: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203969.33685: getting variables
7491 1727203969.33686: in VariableManager get_vars()
7491 1727203969.33732: Calling all_inventory to load vars for managed-node3
7491 1727203969.33734: Calling groups_inventory to load vars for managed-node3
7491 1727203969.33736: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203969.33743: Calling all_plugins_play to load vars for managed-node3
7491 1727203969.33745: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203969.33747: Calling groups_plugins_play to load vars for managed-node3
7491 1727203969.33896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203969.34022: done with get_vars()
7491 1727203969.34030: done getting variables
7491 1727203969.34071: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [TEST: I can configure an interface with auto_gateway enabled] ************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:17
Tuesday 24 September 2024 14:52:49 -0400 (0:00:00.021) 0:00:11.264 *****
7491 1727203969.34091: entering _queue_task() for managed-node3/debug
7491 1727203969.34280: worker is 1 (out of 1 available)
7491 1727203969.34293: exiting _queue_task() for managed-node3/debug
7491 1727203969.34306: done queuing things up, now waiting for results queue to drain
7491 1727203969.34307: waiting for pending results...
7491 1727203969.34462: running TaskExecutor() for managed-node3/TASK: TEST: I can configure an interface with auto_gateway enabled
7491 1727203969.34523: in run() - task 0affcd87-79f5-0a4a-ad01-00000000000e
7491 1727203969.34531: variable 'ansible_search_path' from source: unknown
7491 1727203969.34563: calling self._execute()
7491 1727203969.34631: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203969.34635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203969.34644: variable 'omit' from source: magic vars
7491 1727203969.34907: variable 'ansible_distribution_major_version' from source: facts
7491 1727203969.34918: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203969.34922: variable 'omit' from source: magic vars
7491 1727203969.34937: variable 'omit' from source: magic vars
7491 1727203969.34961: variable 'omit' from source: magic vars
7491 1727203969.34995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7491 1727203969.35022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7491 1727203969.35039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7491 1727203969.35052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203969.35069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203969.35092: variable 'inventory_hostname' from source: host vars for 'managed-node3'
7491 1727203969.35097: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203969.35099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203969.35169: Set connection var ansible_timeout to 10
7491 1727203969.35175: Set connection var ansible_pipelining to False
7491 1727203969.35180: Set connection var ansible_shell_type to sh
7491 1727203969.35185: Set connection var ansible_module_compression to ZIP_DEFLATED
7491 1727203969.35194: Set connection var ansible_shell_executable to /bin/sh
7491 1727203969.35197: Set connection var ansible_connection to ssh
7491 1727203969.35219: variable 'ansible_shell_executable' from source: unknown
7491 1727203969.35222: variable 'ansible_connection' from source: unknown
7491 1727203969.35225: variable 'ansible_module_compression' from source: unknown
7491 1727203969.35229: variable 'ansible_shell_type' from source: unknown
7491 1727203969.35231: variable 'ansible_shell_executable' from source: unknown
7491 1727203969.35234: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203969.35236: variable 'ansible_pipelining' from source: unknown
7491 1727203969.35238: variable 'ansible_timeout' from source: unknown
7491 1727203969.35240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203969.35339: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7491 1727203969.35350: variable 'omit' from source: magic vars
7491 1727203969.35354: starting attempt loop
7491 1727203969.35357: running the handler
7491 1727203969.35393: handler run complete
7491 1727203969.35405: attempt loop complete, returning result
7491 1727203969.35409: _execute() done
7491 1727203969.35411: dumping result to json
7491 1727203969.35415: done dumping result, returning
7491 1727203969.35421: done running TaskExecutor() for managed-node3/TASK: TEST: I can configure an interface with auto_gateway enabled [0affcd87-79f5-0a4a-ad01-00000000000e]
7491 1727203969.35423: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000000e
7491 1727203969.35508: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000000e
7491 1727203969.35511: WORKER PROCESS EXITING
ok: [managed-node3] => {}

MSG:

##################################################
7491 1727203969.35585: no more pending results, returning what we have
7491 1727203969.35588: results queue empty
7491 1727203969.35589: checking for any_errors_fatal
7491 1727203969.35594: done checking for any_errors_fatal
7491 1727203969.35595: checking for max_fail_percentage
7491 1727203969.35596: done checking for max_fail_percentage
7491 1727203969.35597: checking to see if all hosts have failed and the running result is not ok
7491 1727203969.35598: done checking to see if all hosts have failed
7491 1727203969.35599: getting the remaining hosts for this loop
7491 1727203969.35600: done getting the remaining hosts for this loop
7491 1727203969.35603: getting the next task for host managed-node3
7491 1727203969.35608: done getting next task for host managed-node3
7491 1727203969.35613: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
7491 1727203969.35615: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203969.35631: getting variables
7491 1727203969.35633: in VariableManager get_vars()
7491 1727203969.35671: Calling all_inventory to load vars for managed-node3
7491 1727203969.35673: Calling groups_inventory to load vars for managed-node3
7491 1727203969.35674: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203969.35681: Calling all_plugins_play to load vars for managed-node3
7491 1727203969.35684: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203969.35687: Calling groups_plugins_play to load vars for managed-node3
7491 1727203969.35801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203969.35926: done with get_vars()
7491 1727203969.35934: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024 14:52:49 -0400 (0:00:00.019) 0:00:11.283 *****
7491 1727203969.36000: entering _queue_task() for managed-node3/include_tasks
7491 1727203969.36183: worker is 1 (out of 1 available)
7491 1727203969.36196: exiting _queue_task() for managed-node3/include_tasks
7491 1727203969.36208: done queuing things up, now waiting for results queue to drain
7491 1727203969.36210: waiting for pending results...
7491 1727203969.36371: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
7491 1727203969.36452: in run() - task 0affcd87-79f5-0a4a-ad01-000000000016
7491 1727203969.36462: variable 'ansible_search_path' from source: unknown
7491 1727203969.36468: variable 'ansible_search_path' from source: unknown
7491 1727203969.36495: calling self._execute()
7491 1727203969.36622: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203969.36626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203969.36637: variable 'omit' from source: magic vars
7491 1727203969.36887: variable 'ansible_distribution_major_version' from source: facts
7491 1727203969.36897: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203969.36903: _execute() done
7491 1727203969.36906: dumping result to json
7491 1727203969.36909: done dumping result, returning
7491 1727203969.36915: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-0a4a-ad01-000000000016]
7491 1727203969.36921: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000016
7491 1727203969.37002: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000016
7491 1727203969.37005: WORKER PROCESS EXITING
7491 1727203969.37049: no more pending results, returning what we have
7491 1727203969.37053: in VariableManager get_vars()
7491 1727203969.37104: Calling all_inventory to load vars for managed-node3
7491 1727203969.37106: Calling groups_inventory to load vars for managed-node3
7491 1727203969.37161: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203969.37171: Calling all_plugins_play to load vars for managed-node3
7491 1727203969.37173: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203969.37175: Calling groups_plugins_play to load vars for managed-node3
7491 1727203969.37275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203969.37397: done with get_vars()
7491 1727203969.37403: variable 'ansible_search_path' from source: unknown
7491 1727203969.37404: variable 'ansible_search_path' from source: unknown
7491 1727203969.37431: we have included files to process
7491 1727203969.37432: generating all_blocks data
7491 1727203969.37433: done generating all_blocks data
7491 1727203969.37436: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
7491 1727203969.37436: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
7491 1727203969.37438: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
7491 1727203969.37910: done processing included file
7491 1727203969.37911: iterating over new_blocks loaded from include file
7491 1727203969.37912: in VariableManager get_vars()
7491 1727203969.37932: done with get_vars()
7491 1727203969.37934: filtering new block on tags
7491 1727203969.37944: done filtering new block on tags
7491 1727203969.37946: in VariableManager get_vars()
7491 1727203969.37962: done with get_vars()
7491 1727203969.37963: filtering new block on tags
7491 1727203969.37977: done filtering new block on tags
7491 1727203969.37978: in VariableManager get_vars()
7491 1727203969.37998: done with get_vars()
7491 1727203969.37999: filtering new block on tags
7491 1727203969.38010: done filtering new block on tags
7491 1727203969.38012: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3
7491 1727203969.38016: extending task lists for all hosts with included blocks
7491 1727203969.38500: done extending task lists
7491 1727203969.38502: done processing included files
7491 1727203969.38502: results queue empty
7491 1727203969.38503: checking for any_errors_fatal
7491 1727203969.38505: done checking for any_errors_fatal
7491 1727203969.38506: checking for max_fail_percentage
7491 1727203969.38506: done checking for max_fail_percentage
7491 1727203969.38507: checking to see if all hosts have failed and the running result is not ok
7491 1727203969.38507: done checking to see if all hosts have failed
7491 1727203969.38508: getting the remaining hosts for this loop
7491 1727203969.38509: done getting the remaining hosts for this loop
7491 1727203969.38510: getting the next task for host managed-node3
7491 1727203969.38513: done getting next task for host managed-node3
7491 1727203969.38515: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
7491 1727203969.38518: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203969.38525: getting variables
7491 1727203969.38526: in VariableManager get_vars()
7491 1727203969.38541: Calling all_inventory to load vars for managed-node3
7491 1727203969.38542: Calling groups_inventory to load vars for managed-node3
7491 1727203969.38544: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203969.38547: Calling all_plugins_play to load vars for managed-node3
7491 1727203969.38549: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203969.38550: Calling groups_plugins_play to load vars for managed-node3
7491 1727203969.38634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203969.38749: done with get_vars()
7491 1727203969.38757: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 14:52:49 -0400 (0:00:00.028) 0:00:11.311 *****
7491 1727203969.38808: entering _queue_task() for managed-node3/setup
7491 1727203969.39024: worker is 1 (out of 1 available)
7491 1727203969.39037: exiting _queue_task() for managed-node3/setup
7491 1727203969.39051: done queuing things up, now waiting for results queue to drain
7491 1727203969.39052: waiting for pending results...
7491 1727203969.39228: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
7491 1727203969.39320: in run() - task 0affcd87-79f5-0a4a-ad01-000000000809
7491 1727203969.39334: variable 'ansible_search_path' from source: unknown
7491 1727203969.39338: variable 'ansible_search_path' from source: unknown
7491 1727203969.39371: calling self._execute()
7491 1727203969.39432: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203969.39436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203969.39445: variable 'omit' from source: magic vars
7491 1727203969.39713: variable 'ansible_distribution_major_version' from source: facts
7491 1727203969.39729: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203969.39912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7491 1727203969.41501: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7491 1727203969.41761: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7491 1727203969.41793: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7491 1727203969.41821: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7491 1727203969.41839: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7491 1727203969.41901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7491 1727203969.41922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7491 1727203969.41940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7491 1727203969.41968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7491 1727203969.41979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7491 1727203969.42022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7491 1727203969.42038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7491 1727203969.42055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7491 1727203969.42082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7491 1727203969.42092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7491 1727203969.42512: variable '__network_required_facts' from source: role '' defaults
7491 1727203969.42515: variable 'ansible_facts' from source: unknown
7491 1727203969.42520: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
7491 1727203969.42522: when evaluation is False, skipping this task
7491 1727203969.42524: _execute() done
7491 1727203969.42525: dumping result to json
7491 1727203969.42527: done dumping result, returning
7491 1727203969.42529: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-0a4a-ad01-000000000809]
7491 1727203969.42531: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000809
7491 1727203969.42600: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000809
7491 1727203969.42604: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
7491 1727203969.42669: no more pending results, returning what we have
7491 1727203969.42673: results queue empty
7491 1727203969.42675: checking for any_errors_fatal
7491 1727203969.42676: done checking for any_errors_fatal
7491 1727203969.42677: checking for max_fail_percentage
7491 1727203969.42679: done checking for max_fail_percentage
7491 1727203969.42680: checking to see if all hosts have failed and the running result is not ok
7491 1727203969.42681: done checking to see if all hosts have failed
7491 1727203969.42682: getting the remaining hosts for this loop
7491 1727203969.42684: done getting the remaining hosts for this loop
7491 1727203969.42687: getting the next task for host managed-node3
7491 1727203969.42696: done getting next task for host managed-node3
7491 1727203969.42700: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
7491 1727203969.42704: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203969.42728: getting variables
7491 1727203969.42730: in VariableManager get_vars()
7491 1727203969.42788: Calling all_inventory to load vars for managed-node3
7491 1727203969.42792: Calling groups_inventory to load vars for managed-node3
7491 1727203969.42795: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203969.42804: Calling all_plugins_play to load vars for managed-node3
7491 1727203969.42807: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203969.42810: Calling groups_plugins_play to load vars for managed-node3
7491 1727203969.43040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203969.43385: done with get_vars()
7491 1727203969.43395: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 14:52:49 -0400 (0:00:00.046) 0:00:11.358 *****
7491 1727203969.43482: entering _queue_task() for managed-node3/stat
7491 1727203969.43702: worker is 1 (out of 1 available)
7491 1727203969.43713: exiting _queue_task() for managed-node3/stat
7491 1727203969.43730: done queuing things up, now waiting for results queue to drain
7491 1727203969.43732: waiting for pending results...
7491 1727203969.43908: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree
7491 1727203969.44005: in run() - task 0affcd87-79f5-0a4a-ad01-00000000080b
7491 1727203969.44021: variable 'ansible_search_path' from source: unknown
7491 1727203969.44025: variable 'ansible_search_path' from source: unknown
7491 1727203969.44051: calling self._execute()
7491 1727203969.44118: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203969.44123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203969.44128: variable 'omit' from source: magic vars
7491 1727203969.44388: variable 'ansible_distribution_major_version' from source: facts
7491 1727203969.44398: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203969.44515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7491 1727203969.44709: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7491 1727203969.44745: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7491 1727203969.44771: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7491 1727203969.44802: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7491 1727203969.44861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7491 1727203969.44901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7491 1727203969.44924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7491 1727203969.44944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7491 1727203969.45006: variable '__network_is_ostree' from source: set_fact
7491 1727203969.45013: Evaluated conditional (not __network_is_ostree is defined): False
7491 1727203969.45019: when evaluation is False, skipping this task
7491 1727203969.45021: _execute() done
7491 1727203969.45024: dumping result to json
7491 1727203969.45026: done dumping result, returning
7491 1727203969.45029: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-0a4a-ad01-00000000080b]
7491 1727203969.45036: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000080b
7491 1727203969.45121: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000080b
7491 1727203969.45124: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
7491 1727203969.45300: no more pending results, returning what we have
7491 1727203969.45303: results queue empty
7491 1727203969.45305: checking for any_errors_fatal
7491 1727203969.45312: done checking for any_errors_fatal
7491 1727203969.45313: checking for max_fail_percentage
7491 1727203969.45314: done checking for max_fail_percentage
7491 1727203969.45315: checking to see if all hosts have failed and the running result is not ok
7491 1727203969.45316: done checking to see if all hosts have failed
7491
1727203969.45317: getting the remaining hosts for this loop 7491 1727203969.45319: done getting the remaining hosts for this loop 7491 1727203969.45322: getting the next task for host managed-node3 7491 1727203969.45328: done getting next task for host managed-node3 7491 1727203969.45332: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7491 1727203969.45336: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203969.45350: getting variables 7491 1727203969.45352: in VariableManager get_vars() 7491 1727203969.45410: Calling all_inventory to load vars for managed-node3 7491 1727203969.45413: Calling groups_inventory to load vars for managed-node3 7491 1727203969.45416: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203969.45427: Calling all_plugins_play to load vars for managed-node3 7491 1727203969.45430: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203969.45433: Calling groups_plugins_play to load vars for managed-node3 7491 1727203969.45999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203969.46233: done with get_vars() 7491 1727203969.46246: done getting variables 7491 1727203969.46306: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:52:49 -0400 (0:00:00.028) 0:00:11.387 ***** 7491 1727203969.46350: entering _queue_task() for managed-node3/set_fact 7491 1727203969.46631: worker is 1 (out of 1 available) 7491 1727203969.46651: exiting _queue_task() for managed-node3/set_fact 7491 1727203969.46669: done queuing things up, now waiting for results queue to drain 7491 1727203969.46670: waiting for pending results... 
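Every numbered record in this trace follows the same shape: the worker PID, an epoch timestamp with microsecond precision, a colon, then the free-form message. When post-processing a `-vvvv` dump like this one (for example, to compute per-step timings), a minimal parser sketch along these lines works; the helper name and the exact regex are my own, not anything Ansible ships:

```python
import re

# Records in the trace read like:
#   "7491 1727203969.46350: entering _queue_task() for managed-node3/set_fact"
# i.e. "<pid> <epoch seconds.microseconds>: <message>". This pattern is
# inferred from the log above, not taken from Ansible source.
RECORD = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def parse_record(line: str):
    """Split one debug record into (pid, timestamp, message), or None
    for non-record lines such as TASK banners and SSH stderr chunks."""
    m = RECORD.match(line)
    if not m:
        return None
    return int(m.group("pid")), float(m.group("ts")), m.group("msg")
```

Subtracting consecutive timestamps from the same PID then gives the elapsed time between internal steps, which is how the `(0:00:00.028)` figures in the TASK banners can be cross-checked.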
7491 1727203969.47036: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7491 1727203969.47210: in run() - task 0affcd87-79f5-0a4a-ad01-00000000080c 7491 1727203969.47251: variable 'ansible_search_path' from source: unknown 7491 1727203969.47260: variable 'ansible_search_path' from source: unknown 7491 1727203969.47314: calling self._execute() 7491 1727203969.47433: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203969.47446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203969.47466: variable 'omit' from source: magic vars 7491 1727203969.47945: variable 'ansible_distribution_major_version' from source: facts 7491 1727203969.47971: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203969.48158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203969.48469: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203969.48526: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203969.48573: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203969.48614: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203969.48715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203969.48756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203969.48795: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203969.48832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203969.48938: variable '__network_is_ostree' from source: set_fact 7491 1727203969.48956: Evaluated conditional (not __network_is_ostree is defined): False 7491 1727203969.48967: when evaluation is False, skipping this task 7491 1727203969.48975: _execute() done 7491 1727203969.48985: dumping result to json 7491 1727203969.48994: done dumping result, returning 7491 1727203969.49006: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-0a4a-ad01-00000000080c] 7491 1727203969.49021: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000080c skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7491 1727203969.49178: no more pending results, returning what we have 7491 1727203969.49182: results queue empty 7491 1727203969.49184: checking for any_errors_fatal 7491 1727203969.49188: done checking for any_errors_fatal 7491 1727203969.49189: checking for max_fail_percentage 7491 1727203969.49191: done checking for max_fail_percentage 7491 1727203969.49192: checking to see if all hosts have failed and the running result is not ok 7491 1727203969.49193: done checking to see if all hosts have failed 7491 1727203969.49194: getting the remaining hosts for this loop 7491 1727203969.49197: done getting the remaining hosts for this loop 7491 1727203969.49201: getting the next task for host managed-node3 7491 1727203969.49210: done getting next task for host managed-node3 7491 1727203969.49214: ^ 
task is: TASK: fedora.linux_system_roles.network : Check which services are running 7491 1727203969.49220: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203969.49234: getting variables 7491 1727203969.49236: in VariableManager get_vars() 7491 1727203969.49293: Calling all_inventory to load vars for managed-node3 7491 1727203969.49296: Calling groups_inventory to load vars for managed-node3 7491 1727203969.49298: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203969.49312: Calling all_plugins_play to load vars for managed-node3 7491 1727203969.49315: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203969.49320: Calling groups_plugins_play to load vars for managed-node3 7491 1727203969.49562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203969.49926: done with get_vars() 7491 1727203969.49939: done getting variables 7491 1727203969.49971: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000080c 7491 1727203969.49974: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:52:49 -0400 (0:00:00.038) 0:00:11.425 ***** 7491 1727203969.50173: entering _queue_task() for managed-node3/service_facts 7491 1727203969.50175: Creating lock for service_facts 7491 1727203969.50527: worker is 1 (out of 1 available) 7491 1727203969.50538: exiting _queue_task() for managed-node3/service_facts 7491 1727203969.50552: done queuing things up, now waiting for results queue to drain 7491 1727203969.50554: waiting for pending results... 7491 1727203969.50851: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 7491 1727203969.51021: in run() - task 0affcd87-79f5-0a4a-ad01-00000000080e 7491 1727203969.51042: variable 'ansible_search_path' from source: unknown 7491 1727203969.51050: variable 'ansible_search_path' from source: unknown 7491 1727203969.51092: calling self._execute() 7491 1727203969.51190: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203969.51202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203969.51227: variable 'omit' from source: magic vars 7491 1727203969.51613: variable 'ansible_distribution_major_version' from source: facts 7491 1727203969.51634: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203969.51647: variable 'omit' from source: magic vars 7491 1727203969.51737: variable 'omit' from source: magic vars 7491 1727203969.51788: variable 'omit' from source: magic vars 7491 1727203969.51837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203969.51885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203969.51914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203969.51939: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203969.51956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203969.51999: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203969.52008: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203969.52019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203969.52137: Set connection var ansible_timeout to 10 7491 1727203969.52151: Set connection var ansible_pipelining to False 7491 1727203969.52161: Set connection var ansible_shell_type to sh 7491 1727203969.52176: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203969.52189: Set connection var ansible_shell_executable to /bin/sh 7491 1727203969.52205: Set connection var ansible_connection to ssh 7491 1727203969.52238: variable 'ansible_shell_executable' from source: unknown 7491 1727203969.52246: variable 'ansible_connection' from source: unknown 7491 1727203969.52253: variable 'ansible_module_compression' from source: unknown 7491 1727203969.52260: variable 'ansible_shell_type' from source: unknown 7491 1727203969.52269: variable 'ansible_shell_executable' from source: unknown 7491 1727203969.52276: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203969.52283: variable 'ansible_pipelining' from source: unknown 7491 1727203969.52289: variable 'ansible_timeout' from source: unknown 7491 1727203969.52296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203969.52525: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203969.52548: variable 'omit' from source: magic vars 7491 1727203969.52558: starting attempt loop 7491 1727203969.52567: running the handler 7491 1727203969.52585: _low_level_execute_command(): starting 7491 1727203969.52598: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203969.53380: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203969.53400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203969.53423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203969.53443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.53512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.53516: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203969.53519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.53597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203969.53600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203969.53606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 7491 1727203969.53651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203969.55253: stdout chunk (state=3): >>>/root <<< 7491 1727203969.55351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203969.55444: stderr chunk (state=3): >>><<< 7491 1727203969.55457: stdout chunk (state=3): >>><<< 7491 1727203969.55574: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203969.55578: _low_level_execute_command(): starting 7491 1727203969.55581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203969.554906-8296-224091324774132 `" && echo ansible-tmp-1727203969.554906-8296-224091324774132="` echo 
/root/.ansible/tmp/ansible-tmp-1727203969.554906-8296-224091324774132 `" ) && sleep 0' 7491 1727203969.56142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203969.56157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203969.56179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203969.56198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.56249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203969.56261: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203969.56278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.56297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203969.56309: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203969.56320: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203969.56338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203969.56352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203969.56372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.56385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203969.56399: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203969.56413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.56490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203969.56510: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203969.56527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203969.56602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203969.58415: stdout chunk (state=3): >>>ansible-tmp-1727203969.554906-8296-224091324774132=/root/.ansible/tmp/ansible-tmp-1727203969.554906-8296-224091324774132 <<< 7491 1727203969.58514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203969.58626: stderr chunk (state=3): >>><<< 7491 1727203969.58629: stdout chunk (state=3): >>><<< 7491 1727203969.58774: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203969.554906-8296-224091324774132=/root/.ansible/tmp/ansible-tmp-1727203969.554906-8296-224091324774132 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203969.58777: 
variable 'ansible_module_compression' from source: unknown 7491 1727203969.58780: ANSIBALLZ: Using lock for service_facts 7491 1727203969.58782: ANSIBALLZ: Acquiring lock 7491 1727203969.58784: ANSIBALLZ: Lock acquired: 139674605375840 7491 1727203969.58786: ANSIBALLZ: Creating module 7491 1727203969.72098: ANSIBALLZ: Writing module into payload 7491 1727203969.72180: ANSIBALLZ: Writing module 7491 1727203969.72202: ANSIBALLZ: Renaming module 7491 1727203969.72205: ANSIBALLZ: Done creating module 7491 1727203969.72224: variable 'ansible_facts' from source: unknown 7491 1727203969.72272: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203969.554906-8296-224091324774132/AnsiballZ_service_facts.py 7491 1727203969.72386: Sending initial data 7491 1727203969.72389: Sent initial data (159 bytes) 7491 1727203969.73099: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203969.73103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203969.73119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.73161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.73169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203969.73172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.73239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203969.73242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203969.73243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203969.73286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203969.74974: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203969.75003: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203969.75043: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpyxqbgs6_ /root/.ansible/tmp/ansible-tmp-1727203969.554906-8296-224091324774132/AnsiballZ_service_facts.py <<< 7491 1727203969.75082: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203969.75900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203969.76023: stderr chunk (state=3): >>><<< 7491 1727203969.76027: stdout chunk (state=3): >>><<< 7491 1727203969.76045: done transferring module to remote 7491 1727203969.76055: _low_level_execute_command(): starting 7491 1727203969.76060: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727203969.554906-8296-224091324774132/ /root/.ansible/tmp/ansible-tmp-1727203969.554906-8296-224091324774132/AnsiballZ_service_facts.py && sleep 0' 7491 1727203969.76537: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203969.76548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.76581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203969.76601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203969.76611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.76661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203969.76671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203969.76676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203969.76736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203969.78430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203969.78489: stderr chunk (state=3): >>><<< 7491 1727203969.78493: stdout chunk (state=3): >>><<< 7491 1727203969.78507: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203969.78510: _low_level_execute_command(): starting 7491 1727203969.78515: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203969.554906-8296-224091324774132/AnsiballZ_service_facts.py && sleep 0' 7491 1727203969.78989: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203969.78993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203969.79033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.79045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203969.79103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203969.79112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203969.79176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203971.10093: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": 
{"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": 
"hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 7491 1727203971.10160: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": 
"systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": 
{"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": 
"system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, <<< 7491 1727203971.10172: stdout chunk (state=3): >>>"systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7491 1727203971.11365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203971.11423: stderr chunk (state=3): >>><<< 7491 1727203971.11426: stdout chunk (state=3): >>><<< 7491 1727203971.11442: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": 
"nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": 
"selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": 
{"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": 
"systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203971.11777: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203969.554906-8296-224091324774132/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203971.11786: _low_level_execute_command(): starting 7491 1727203971.11791: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203969.554906-8296-224091324774132/ > /dev/null 2>&1 && sleep 0' 7491 1727203971.12281: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203971.12287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203971.12393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203971.12419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203971.12436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203971.12509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203971.14268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203971.14322: stderr chunk (state=3): >>><<< 7491 1727203971.14325: stdout chunk (state=3): >>><<< 7491 1727203971.14341: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203971.14345: handler run complete 7491 1727203971.14447: variable 'ansible_facts' from source: unknown 7491 1727203971.14534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203971.14777: variable 'ansible_facts' from source: unknown 7491 1727203971.14850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203971.14955: attempt loop complete, returning result 7491 1727203971.14958: _execute() done 7491 1727203971.14961: dumping result to json 7491 1727203971.14993: done dumping result, returning 7491 1727203971.15003: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-0a4a-ad01-00000000080e] 7491 1727203971.15008: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000080e 7491 1727203971.15729: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000080e 7491 1727203971.15732: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203971.15773: no more pending results, returning what we have 7491 1727203971.15775: 
results queue empty 7491 1727203971.15776: checking for any_errors_fatal 7491 1727203971.15778: done checking for any_errors_fatal 7491 1727203971.15779: checking for max_fail_percentage 7491 1727203971.15780: done checking for max_fail_percentage 7491 1727203971.15780: checking to see if all hosts have failed and the running result is not ok 7491 1727203971.15781: done checking to see if all hosts have failed 7491 1727203971.15782: getting the remaining hosts for this loop 7491 1727203971.15783: done getting the remaining hosts for this loop 7491 1727203971.15785: getting the next task for host managed-node3 7491 1727203971.15788: done getting next task for host managed-node3 7491 1727203971.15790: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7491 1727203971.15793: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203971.15800: getting variables 7491 1727203971.15801: in VariableManager get_vars() 7491 1727203971.15829: Calling all_inventory to load vars for managed-node3 7491 1727203971.15831: Calling groups_inventory to load vars for managed-node3 7491 1727203971.15833: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203971.15839: Calling all_plugins_play to load vars for managed-node3 7491 1727203971.15841: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203971.15842: Calling groups_plugins_play to load vars for managed-node3 7491 1727203971.16040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203971.16299: done with get_vars() 7491 1727203971.16308: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:52:51 -0400 (0:00:01.662) 0:00:13.087 ***** 7491 1727203971.16381: entering _queue_task() for managed-node3/package_facts 7491 1727203971.16382: Creating lock for package_facts 7491 1727203971.16584: worker is 1 (out of 1 available) 7491 1727203971.16596: exiting _queue_task() for managed-node3/package_facts 7491 1727203971.16609: done queuing things up, now waiting for results queue to drain 7491 1727203971.16611: waiting for pending results... 
7491 1727203971.16790: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7491 1727203971.16893: in run() - task 0affcd87-79f5-0a4a-ad01-00000000080f 7491 1727203971.16910: variable 'ansible_search_path' from source: unknown 7491 1727203971.16913: variable 'ansible_search_path' from source: unknown 7491 1727203971.16940: calling self._execute() 7491 1727203971.17013: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203971.17031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203971.17045: variable 'omit' from source: magic vars 7491 1727203971.17418: variable 'ansible_distribution_major_version' from source: facts 7491 1727203971.17437: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203971.17450: variable 'omit' from source: magic vars 7491 1727203971.17533: variable 'omit' from source: magic vars 7491 1727203971.17578: variable 'omit' from source: magic vars 7491 1727203971.17625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203971.17666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203971.17693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203971.17715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203971.17734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203971.17767: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203971.17776: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203971.17784: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727203971.17900: Set connection var ansible_timeout to 10 7491 1727203971.17912: Set connection var ansible_pipelining to False 7491 1727203971.17925: Set connection var ansible_shell_type to sh 7491 1727203971.17935: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203971.17946: Set connection var ansible_shell_executable to /bin/sh 7491 1727203971.17955: Set connection var ansible_connection to ssh 7491 1727203971.17985: variable 'ansible_shell_executable' from source: unknown 7491 1727203971.17993: variable 'ansible_connection' from source: unknown 7491 1727203971.18003: variable 'ansible_module_compression' from source: unknown 7491 1727203971.18009: variable 'ansible_shell_type' from source: unknown 7491 1727203971.18021: variable 'ansible_shell_executable' from source: unknown 7491 1727203971.18028: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203971.18037: variable 'ansible_pipelining' from source: unknown 7491 1727203971.18044: variable 'ansible_timeout' from source: unknown 7491 1727203971.18052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203971.18263: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203971.18282: variable 'omit' from source: magic vars 7491 1727203971.18291: starting attempt loop 7491 1727203971.18299: running the handler 7491 1727203971.18320: _low_level_execute_command(): starting 7491 1727203971.18332: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203971.18903: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203971.18913: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203971.18955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203971.18959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203971.18961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203971.19026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203971.19032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203971.19035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203971.19068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203971.20774: stdout chunk (state=3): >>>/root <<< 7491 1727203971.20903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203971.20906: stderr chunk (state=3): >>><<< 7491 1727203971.20908: stdout chunk (state=3): >>><<< 7491 1727203971.21013: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203971.21017: _low_level_execute_command(): starting 7491 1727203971.21019: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203971.2092977-8359-124196896633261 `" && echo ansible-tmp-1727203971.2092977-8359-124196896633261="` echo /root/.ansible/tmp/ansible-tmp-1727203971.2092977-8359-124196896633261 `" ) && sleep 0' 7491 1727203971.21635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203971.21650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203971.21676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203971.21697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203971.21742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203971.21763: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203971.21787: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203971.21807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203971.21820: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203971.21831: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203971.21843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203971.21856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203971.21876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203971.21915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203971.21930: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203971.21945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203971.22053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203971.22081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203971.22099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203971.22179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203971.23950: stdout chunk (state=3): >>>ansible-tmp-1727203971.2092977-8359-124196896633261=/root/.ansible/tmp/ansible-tmp-1727203971.2092977-8359-124196896633261 <<< 7491 1727203971.24058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203971.24159: stderr chunk (state=3): >>><<< 7491 1727203971.24175: stdout chunk (state=3): >>><<< 7491 1727203971.24376: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727203971.2092977-8359-124196896633261=/root/.ansible/tmp/ansible-tmp-1727203971.2092977-8359-124196896633261 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203971.24379: variable 'ansible_module_compression' from source: unknown 7491 1727203971.24382: ANSIBALLZ: Using lock for package_facts 7491 1727203971.24385: ANSIBALLZ: Acquiring lock 7491 1727203971.24387: ANSIBALLZ: Lock acquired: 139674600640000 7491 1727203971.24389: ANSIBALLZ: Creating module 7491 1727203971.55012: ANSIBALLZ: Writing module into payload 7491 1727203971.55128: ANSIBALLZ: Writing module 7491 1727203971.55158: ANSIBALLZ: Renaming module 7491 1727203971.55162: ANSIBALLZ: Done creating module 7491 1727203971.55195: variable 'ansible_facts' from source: unknown 7491 1727203971.55332: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727203971.2092977-8359-124196896633261/AnsiballZ_package_facts.py 7491 1727203971.55466: Sending initial data 7491 1727203971.55470: Sent initial data (160 bytes) 7491 1727203971.56226: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203971.56231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203971.56268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203971.56272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203971.56274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203971.56276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203971.56331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203971.56335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203971.56337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203971.56390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203971.58173: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203971.58209: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203971.58246: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpr62y8vuz /root/.ansible/tmp/ansible-tmp-1727203971.2092977-8359-124196896633261/AnsiballZ_package_facts.py <<< 7491 1727203971.58285: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203971.60098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203971.60290: stderr chunk (state=3): >>><<< 7491 1727203971.60293: stdout chunk (state=3): >>><<< 7491 1727203971.60296: done transferring module to remote 7491 1727203971.60298: _low_level_execute_command(): starting 7491 1727203971.60300: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203971.2092977-8359-124196896633261/ /root/.ansible/tmp/ansible-tmp-1727203971.2092977-8359-124196896633261/AnsiballZ_package_facts.py && sleep 0' 7491 1727203971.61002: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203971.61010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203971.61027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203971.61048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
7491 1727203971.61087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203971.61095: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203971.61101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203971.61127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203971.61130: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203971.61132: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203971.61151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203971.61154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203971.61178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203971.61181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203971.61184: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203971.61190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203971.61272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203971.61291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203971.61300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203971.61375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203971.63058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203971.63122: stderr chunk (state=3): >>><<< 7491 1727203971.63126: stdout chunk (state=3): >>><<< 7491 1727203971.63137: _low_level_execute_command() done: 
rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203971.63140: _low_level_execute_command(): starting 7491 1727203971.63146: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203971.2092977-8359-124196896633261/AnsiballZ_package_facts.py && sleep 0' 7491 1727203971.63770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203971.63779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203971.63794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203971.63803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203971.63842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 
1727203971.63849: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203971.63859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203971.63876: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203971.63884: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203971.63890: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203971.63901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203971.63907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203971.63922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203971.63926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203971.63933: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203971.63944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203971.64029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203971.64036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203971.64039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203971.64125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203972.10050: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": 
"2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": 
"2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": 
"libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": 
"libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", 
"release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": 
"1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 7491 1727203972.10125: stdout chunk (state=3): >>>ssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", 
"version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": 
[{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": 
[{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", 
"epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": 
[{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": 
[{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, 
"arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": 
"perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": 
"1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": 
"511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": 
"4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7491 1727203972.11691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
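[editor's note] The module result above ends with its `invocation` block (`package_facts` called with `manager: auto`, `strategy: first`). The returned `ansible_facts.packages` fact maps each package name to a *list* of installed instances, each carrying `name`, `version`, `release`, `epoch`, `arch`, and `source` keys, exactly as streamed in the log. A minimal sketch of consuming that structure (the sample dict below is copied from entries in this log; the `evr` helper is illustrative, not part of Ansible):

```python
# Sketch: querying the ansible_facts.packages structure returned by the
# package_facts module. Each key maps to a list because the same package
# name can be installed more than once (e.g. multiple kernel versions).
packages = {
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
    "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0",
                      "release": "511.el9", "epoch": None, "arch": "x86_64",
                      "source": "rpm"}],
}

def evr(entry):
    """Format an rpm-style [epoch:]version-release string from one entry."""
    epoch = entry.get("epoch")
    ev = f"{epoch}:{entry['version']}" if epoch else entry["version"]
    return f"{ev}-{entry['release']}"

print(evr(packages["openssl"][0]))       # -> 1:3.2.2-6.el9
print(evr(packages["kernel-devel"][0]))  # -> 5.14.0-511.el9
```

In a playbook, the same lookup would appear as `ansible_facts.packages['openssl'][0].version` after a `package_facts` task has run; note that `epoch` is `null` (Python `None`) when the package has no epoch.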
<<< 7491 1727203972.11750: stderr chunk (state=3): >>><<< 7491 1727203972.11753: stdout chunk (state=3): >>><<< 7491 1727203972.11978: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203972.15279: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203971.2092977-8359-124196896633261/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203972.15303: _low_level_execute_command(): starting 7491 1727203972.15307: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203971.2092977-8359-124196896633261/ > /dev/null 2>&1 && sleep 0' 7491 1727203972.15974: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203972.15985: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 7491 1727203972.15993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203972.16010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203972.16051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203972.16057: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203972.16070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203972.16082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203972.16090: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203972.16101: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203972.16104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203972.16113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203972.16125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203972.16133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203972.16139: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203972.16149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203972.16223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203972.16239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203972.16242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203972.16320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 
1727203972.18129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203972.18222: stderr chunk (state=3): >>><<< 7491 1727203972.18225: stdout chunk (state=3): >>><<< 7491 1727203972.18240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203972.18246: handler run complete 7491 1727203972.19406: variable 'ansible_facts' from source: unknown 7491 1727203972.20212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203972.28611: variable 'ansible_facts' from source: unknown 7491 1727203972.29059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203972.29791: attempt loop complete, returning result 7491 1727203972.29797: _execute() done 7491 1727203972.29800: dumping result to 
json 7491 1727203972.30046: done dumping result, returning 7491 1727203972.30053: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-0a4a-ad01-00000000080f] 7491 1727203972.30056: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000080f 7491 1727203972.32292: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000080f 7491 1727203972.32296: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203972.32402: no more pending results, returning what we have 7491 1727203972.32405: results queue empty 7491 1727203972.32406: checking for any_errors_fatal 7491 1727203972.32411: done checking for any_errors_fatal 7491 1727203972.32412: checking for max_fail_percentage 7491 1727203972.32413: done checking for max_fail_percentage 7491 1727203972.32414: checking to see if all hosts have failed and the running result is not ok 7491 1727203972.32415: done checking to see if all hosts have failed 7491 1727203972.32418: getting the remaining hosts for this loop 7491 1727203972.32420: done getting the remaining hosts for this loop 7491 1727203972.32423: getting the next task for host managed-node3 7491 1727203972.32430: done getting next task for host managed-node3 7491 1727203972.32433: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7491 1727203972.32436: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203972.32447: getting variables 7491 1727203972.32449: in VariableManager get_vars() 7491 1727203972.32495: Calling all_inventory to load vars for managed-node3 7491 1727203972.32498: Calling groups_inventory to load vars for managed-node3 7491 1727203972.32503: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203972.32513: Calling all_plugins_play to load vars for managed-node3 7491 1727203972.32518: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203972.32521: Calling groups_plugins_play to load vars for managed-node3 7491 1727203972.34687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203972.36530: done with get_vars() 7491 1727203972.36571: done getting variables 7491 1727203972.36637: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:52:52 -0400 (0:00:01.202) 0:00:14.290 ***** 7491 1727203972.36674: entering _queue_task() for managed-node3/debug 7491 1727203972.36981: worker is 1 (out of 1 available) 7491 1727203972.36994: exiting _queue_task() for managed-node3/debug 7491 1727203972.37008: done queuing things up, now waiting for results queue to drain 7491 1727203972.37009: waiting for pending results... 
7491 1727203972.37300: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 7491 1727203972.37441: in run() - task 0affcd87-79f5-0a4a-ad01-000000000017 7491 1727203972.37468: variable 'ansible_search_path' from source: unknown 7491 1727203972.37478: variable 'ansible_search_path' from source: unknown 7491 1727203972.37525: calling self._execute() 7491 1727203972.37623: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203972.37636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203972.37651: variable 'omit' from source: magic vars 7491 1727203972.38043: variable 'ansible_distribution_major_version' from source: facts 7491 1727203972.38059: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203972.38074: variable 'omit' from source: magic vars 7491 1727203972.38135: variable 'omit' from source: magic vars 7491 1727203972.38236: variable 'network_provider' from source: set_fact 7491 1727203972.38257: variable 'omit' from source: magic vars 7491 1727203972.38305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203972.38349: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203972.38379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203972.38402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203972.38419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203972.38459: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203972.38474: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203972.38509: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203972.38615: Set connection var ansible_timeout to 10 7491 1727203972.38629: Set connection var ansible_pipelining to False 7491 1727203972.38640: Set connection var ansible_shell_type to sh 7491 1727203972.38656: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203972.38672: Set connection var ansible_shell_executable to /bin/sh 7491 1727203972.38683: Set connection var ansible_connection to ssh 7491 1727203972.38711: variable 'ansible_shell_executable' from source: unknown 7491 1727203972.38720: variable 'ansible_connection' from source: unknown 7491 1727203972.38729: variable 'ansible_module_compression' from source: unknown 7491 1727203972.38736: variable 'ansible_shell_type' from source: unknown 7491 1727203972.38743: variable 'ansible_shell_executable' from source: unknown 7491 1727203972.38749: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203972.38761: variable 'ansible_pipelining' from source: unknown 7491 1727203972.38771: variable 'ansible_timeout' from source: unknown 7491 1727203972.38779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203972.38929: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203972.38946: variable 'omit' from source: magic vars 7491 1727203972.38956: starting attempt loop 7491 1727203972.38963: running the handler 7491 1727203972.39014: handler run complete 7491 1727203972.39033: attempt loop complete, returning result 7491 1727203972.39040: _execute() done 7491 1727203972.39046: dumping result to json 7491 1727203972.39052: done dumping result, returning 7491 1727203972.39063: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-0a4a-ad01-000000000017] 7491 1727203972.39077: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000017 7491 1727203972.39182: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000017 ok: [managed-node3] => {} MSG: Using network provider: nm 7491 1727203972.39246: no more pending results, returning what we have 7491 1727203972.39250: results queue empty 7491 1727203972.39251: checking for any_errors_fatal 7491 1727203972.39263: done checking for any_errors_fatal 7491 1727203972.39266: checking for max_fail_percentage 7491 1727203972.39268: done checking for max_fail_percentage 7491 1727203972.39270: checking to see if all hosts have failed and the running result is not ok 7491 1727203972.39271: done checking to see if all hosts have failed 7491 1727203972.39272: getting the remaining hosts for this loop 7491 1727203972.39274: done getting the remaining hosts for this loop 7491 1727203972.39278: getting the next task for host managed-node3 7491 1727203972.39285: done getting next task for host managed-node3 7491 1727203972.39289: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7491 1727203972.39293: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203972.39303: getting variables 7491 1727203972.39305: in VariableManager get_vars() 7491 1727203972.39366: Calling all_inventory to load vars for managed-node3 7491 1727203972.39369: Calling groups_inventory to load vars for managed-node3 7491 1727203972.39372: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203972.39384: Calling all_plugins_play to load vars for managed-node3 7491 1727203972.39387: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203972.39391: Calling groups_plugins_play to load vars for managed-node3 7491 1727203972.40382: WORKER PROCESS EXITING 7491 1727203972.41034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203972.42625: done with get_vars() 7491 1727203972.42655: done getting variables 7491 1727203972.42718: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:52:52 -0400 (0:00:00.060) 0:00:14.351 ***** 7491 1727203972.42753: entering _queue_task() for managed-node3/fail 7491 1727203972.43038: worker is 1 (out of 1 available) 7491 1727203972.43050: exiting _queue_task() for managed-node3/fail 7491 1727203972.43066: done queuing things up, now waiting for results queue to drain 7491 1727203972.43068: waiting for pending results... 
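The "Print network provider" task that completed above (ok: `Using network provider: nm`) is a plain `debug` action loaded from `plugins/action/debug.py`. A hedged reconstruction of what that task in the role's `tasks/main.yml` likely looks like, inferred only from this log (the exact wording in the collection may differ):

```yaml
# Hedged sketch reconstructed from the log output above; the actual task in
# fedora.linux_system_roles.network roles/network/tasks/main.yml may differ.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```

Note the log shows `network_provider` coming from `set_fact`, which is why the rendered message resolves to `nm` on this host.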
7491 1727203972.43344: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7491 1727203972.43492: in run() - task 0affcd87-79f5-0a4a-ad01-000000000018 7491 1727203972.43518: variable 'ansible_search_path' from source: unknown 7491 1727203972.43528: variable 'ansible_search_path' from source: unknown 7491 1727203972.43570: calling self._execute() 7491 1727203972.43666: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203972.43678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203972.43692: variable 'omit' from source: magic vars 7491 1727203972.44061: variable 'ansible_distribution_major_version' from source: facts 7491 1727203972.44081: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203972.44208: variable 'network_state' from source: role '' defaults 7491 1727203972.44223: Evaluated conditional (network_state != {}): False 7491 1727203972.44231: when evaluation is False, skipping this task 7491 1727203972.44238: _execute() done 7491 1727203972.44246: dumping result to json 7491 1727203972.44253: done dumping result, returning 7491 1727203972.44267: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-0a4a-ad01-000000000018] 7491 1727203972.44282: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000018 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203972.44435: no more pending results, returning what we have 7491 1727203972.44439: results queue empty 7491 1727203972.44440: checking for any_errors_fatal 7491 1727203972.44447: done checking for 
any_errors_fatal 7491 1727203972.44448: checking for max_fail_percentage 7491 1727203972.44450: done checking for max_fail_percentage 7491 1727203972.44451: checking to see if all hosts have failed and the running result is not ok 7491 1727203972.44452: done checking to see if all hosts have failed 7491 1727203972.44453: getting the remaining hosts for this loop 7491 1727203972.44454: done getting the remaining hosts for this loop 7491 1727203972.44459: getting the next task for host managed-node3 7491 1727203972.44469: done getting next task for host managed-node3 7491 1727203972.44473: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7491 1727203972.44477: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203972.44494: getting variables 7491 1727203972.44496: in VariableManager get_vars() 7491 1727203972.44555: Calling all_inventory to load vars for managed-node3 7491 1727203972.44558: Calling groups_inventory to load vars for managed-node3 7491 1727203972.44560: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203972.44575: Calling all_plugins_play to load vars for managed-node3 7491 1727203972.44579: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203972.44582: Calling groups_plugins_play to load vars for managed-node3 7491 1727203972.45709: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000018 7491 1727203972.45713: WORKER PROCESS EXITING 7491 1727203972.46220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203972.47150: done with get_vars() 7491 1727203972.47170: done getting variables 7491 1727203972.47217: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:52:52 -0400 (0:00:00.044) 0:00:14.396 ***** 7491 1727203972.47244: entering _queue_task() for managed-node3/fail 7491 1727203972.47465: worker is 1 (out of 1 available) 7491 1727203972.47480: exiting _queue_task() for managed-node3/fail 7491 1727203972.47493: done queuing things up, now waiting for results queue to drain 7491 1727203972.47495: waiting for pending results... 
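The skip result above reports `"false_condition": "network_state != {}"`, meaning the abort task at `tasks/main.yml:11` never ran because `network_state` is still the role default of `{}`. A hedged sketch of such a guard task; the `fail` message and the initscripts check are assumptions, not taken from the log:

```yaml
# Hedged sketch based on the skip result above
# ("false_condition": "network_state != {}").
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  fail:
    msg: Applying the network state configuration is not supported with the initscripts provider.  # assumed wording
  when:
    - network_state != {}
    - network_provider == "initscripts"  # assumption; only the first condition appears in the log
```

Because `when` conditions are evaluated in order and short-circuit, the log only ever records the first condition that evaluates false.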
7491 1727203972.47685: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7491 1727203972.47860: in run() - task 0affcd87-79f5-0a4a-ad01-000000000019 7491 1727203972.47903: variable 'ansible_search_path' from source: unknown 7491 1727203972.47919: variable 'ansible_search_path' from source: unknown 7491 1727203972.47964: calling self._execute() 7491 1727203972.48077: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203972.48093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203972.48110: variable 'omit' from source: magic vars 7491 1727203972.48534: variable 'ansible_distribution_major_version' from source: facts 7491 1727203972.48586: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203972.48827: variable 'network_state' from source: role '' defaults 7491 1727203972.48845: Evaluated conditional (network_state != {}): False 7491 1727203972.48853: when evaluation is False, skipping this task 7491 1727203972.48857: _execute() done 7491 1727203972.48859: dumping result to json 7491 1727203972.48862: done dumping result, returning 7491 1727203972.48896: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-0a4a-ad01-000000000019] 7491 1727203972.48899: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000019 7491 1727203972.48986: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000019 7491 1727203972.48989: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203972.49059: no more pending results, returning what we have 7491 1727203972.49063: results queue 
empty 7491 1727203972.49065: checking for any_errors_fatal 7491 1727203972.49073: done checking for any_errors_fatal 7491 1727203972.49074: checking for max_fail_percentage 7491 1727203972.49075: done checking for max_fail_percentage 7491 1727203972.49076: checking to see if all hosts have failed and the running result is not ok 7491 1727203972.49077: done checking to see if all hosts have failed 7491 1727203972.49078: getting the remaining hosts for this loop 7491 1727203972.49080: done getting the remaining hosts for this loop 7491 1727203972.49084: getting the next task for host managed-node3 7491 1727203972.49090: done getting next task for host managed-node3 7491 1727203972.49094: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7491 1727203972.49099: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203972.49113: getting variables 7491 1727203972.49114: in VariableManager get_vars() 7491 1727203972.49161: Calling all_inventory to load vars for managed-node3 7491 1727203972.49165: Calling groups_inventory to load vars for managed-node3 7491 1727203972.49168: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203972.49176: Calling all_plugins_play to load vars for managed-node3 7491 1727203972.49177: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203972.49179: Calling groups_plugins_play to load vars for managed-node3 7491 1727203972.49967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203972.51238: done with get_vars() 7491 1727203972.51262: done getting variables 7491 1727203972.51335: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:52:52 -0400 (0:00:00.041) 0:00:14.437 ***** 7491 1727203972.51376: entering _queue_task() for managed-node3/fail 7491 1727203972.51687: worker is 1 (out of 1 available) 7491 1727203972.51699: exiting _queue_task() for managed-node3/fail 7491 1727203972.51711: done queuing things up, now waiting for results queue to drain 7491 1727203972.51712: waiting for pending results... 
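The EL10 teaming guard queued above is evaluated next; as the subsequent log entries show, it skips with `"false_condition": "ansible_distribution_major_version | int > 9"`. A hedged sketch of that guard, with only the condition taken from the log and the message assumed:

```yaml
# Hedged sketch matching the false condition recorded in the log
# ("ansible_distribution_major_version | int > 9"); msg wording is assumed.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Teaming is not supported on EL10 or later.  # assumed wording
  when: ansible_distribution_major_version | int > 9
```

Evaluating `| int > 9` forces the filter plugins (`core`, `mathstuff`, etc.) to load, which is why a burst of `Loading FilterModule` lines precedes this conditional in the log.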
7491 1727203972.52013: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7491 1727203972.52169: in run() - task 0affcd87-79f5-0a4a-ad01-00000000001a 7491 1727203972.52194: variable 'ansible_search_path' from source: unknown 7491 1727203972.52202: variable 'ansible_search_path' from source: unknown 7491 1727203972.52247: calling self._execute() 7491 1727203972.52352: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203972.52363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203972.52382: variable 'omit' from source: magic vars 7491 1727203972.52788: variable 'ansible_distribution_major_version' from source: facts 7491 1727203972.52811: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203972.53010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203972.56090: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203972.56186: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203972.56234: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203972.56283: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203972.56319: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203972.56412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203972.56448: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203972.56481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203972.56539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203972.56560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203972.56672: variable 'ansible_distribution_major_version' from source: facts 7491 1727203972.56694: Evaluated conditional (ansible_distribution_major_version | int > 9): False 7491 1727203972.56702: when evaluation is False, skipping this task 7491 1727203972.56709: _execute() done 7491 1727203972.56725: dumping result to json 7491 1727203972.56732: done dumping result, returning 7491 1727203972.56743: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-0a4a-ad01-00000000001a] 7491 1727203972.56752: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001a skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 7491 1727203972.56904: no more pending results, returning what we have 7491 1727203972.56908: results queue empty 7491 1727203972.56909: checking for any_errors_fatal 7491 1727203972.56919: done checking for any_errors_fatal 7491 1727203972.56920: 
checking for max_fail_percentage 7491 1727203972.56922: done checking for max_fail_percentage 7491 1727203972.56923: checking to see if all hosts have failed and the running result is not ok 7491 1727203972.56925: done checking to see if all hosts have failed 7491 1727203972.56925: getting the remaining hosts for this loop 7491 1727203972.56928: done getting the remaining hosts for this loop 7491 1727203972.56932: getting the next task for host managed-node3 7491 1727203972.56939: done getting next task for host managed-node3 7491 1727203972.56943: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7491 1727203972.56947: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203972.56964: getting variables 7491 1727203972.56966: in VariableManager get_vars() 7491 1727203972.57024: Calling all_inventory to load vars for managed-node3 7491 1727203972.57027: Calling groups_inventory to load vars for managed-node3 7491 1727203972.57030: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203972.57041: Calling all_plugins_play to load vars for managed-node3 7491 1727203972.57044: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203972.57047: Calling groups_plugins_play to load vars for managed-node3 7491 1727203972.58165: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001a 7491 1727203972.58170: WORKER PROCESS EXITING 7491 1727203972.59019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203972.61025: done with get_vars() 7491 1727203972.61054: done getting variables 7491 1727203972.61189: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:52:52 -0400 (0:00:00.098) 0:00:14.536 ***** 7491 1727203972.61226: entering _queue_task() for managed-node3/dnf 7491 1727203972.61566: worker is 1 (out of 1 available) 7491 1727203972.61579: exiting _queue_task() for managed-node3/dnf 7491 1727203972.61592: done queuing things up, now waiting for results queue to drain 7491 1727203972.61593: waiting for pending results... 
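The DNF check task queued above (at `tasks/main.yml:36`) is skipped in the entries that follow because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` holds for this host's `network_connections`. A hedged sketch of such a task; both `when` conditions appear verbatim in the log, while the module arguments are assumptions:

```yaml
# Hedged sketch of the skipped DNF check; the conditions come from the log,
# the package list variable and state are assumptions.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"  # assumed variable name
    state: latest
  check_mode: true  # assumption; the log only shows the task being evaluated and skipped
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
```

The log confirms the first condition evaluates True on this EL host, so it is the wireless/team check that short-circuits the task to a skip.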
7491 1727203972.61939: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7491 1727203972.62119: in run() - task 0affcd87-79f5-0a4a-ad01-00000000001b 7491 1727203972.62141: variable 'ansible_search_path' from source: unknown 7491 1727203972.62149: variable 'ansible_search_path' from source: unknown 7491 1727203972.62229: calling self._execute() 7491 1727203972.62328: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203972.62339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203972.62353: variable 'omit' from source: magic vars 7491 1727203972.62777: variable 'ansible_distribution_major_version' from source: facts 7491 1727203972.62795: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203972.62997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203972.66733: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203972.66830: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203972.66877: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203972.66920: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203972.66954: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203972.67073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203972.67109: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203972.67150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203972.67198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203972.67218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203972.67362: variable 'ansible_distribution' from source: facts 7491 1727203972.67375: variable 'ansible_distribution_major_version' from source: facts 7491 1727203972.67396: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7491 1727203972.67527: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203972.67676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203972.67711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203972.67743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203972.67795: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203972.67820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203972.67866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203972.67899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203972.67934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203972.67981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203972.68003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203972.68056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203972.68086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203972.68136: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203972.68184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203972.68205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203972.68476: variable 'network_connections' from source: task vars 7491 1727203972.68525: variable 'interface' from source: play vars 7491 1727203972.68628: variable 'interface' from source: play vars 7491 1727203972.68757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203972.68972: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203972.69023: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203972.69073: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203972.69112: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203972.69161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203972.69189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203972.69237: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203972.69269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203972.69339: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203972.69599: variable 'network_connections' from source: task vars 7491 1727203972.69610: variable 'interface' from source: play vars 7491 1727203972.69685: variable 'interface' from source: play vars 7491 1727203972.69729: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727203972.69737: when evaluation is False, skipping this task 7491 1727203972.69748: _execute() done 7491 1727203972.69760: dumping result to json 7491 1727203972.69770: done dumping result, returning 7491 1727203972.69781: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-00000000001b] 7491 1727203972.69790: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001b skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7491 1727203972.69942: no more pending results, returning what we have 7491 1727203972.69946: results queue empty 7491 1727203972.69947: checking for any_errors_fatal 7491 1727203972.69954: done checking for any_errors_fatal 7491 1727203972.69955: checking for max_fail_percentage 7491 1727203972.69956: done checking for max_fail_percentage 7491 1727203972.69957: checking to see if all hosts have failed 
and the running result is not ok 7491 1727203972.69958: done checking to see if all hosts have failed 7491 1727203972.69959: getting the remaining hosts for this loop 7491 1727203972.69961: done getting the remaining hosts for this loop 7491 1727203972.69967: getting the next task for host managed-node3 7491 1727203972.69974: done getting next task for host managed-node3 7491 1727203972.69978: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7491 1727203972.69981: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203972.69997: getting variables 7491 1727203972.69999: in VariableManager get_vars() 7491 1727203972.70053: Calling all_inventory to load vars for managed-node3 7491 1727203972.70056: Calling groups_inventory to load vars for managed-node3 7491 1727203972.70058: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203972.70071: Calling all_plugins_play to load vars for managed-node3 7491 1727203972.70074: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203972.70077: Calling groups_plugins_play to load vars for managed-node3 7491 1727203972.71184: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001b 7491 1727203972.71188: WORKER PROCESS EXITING 7491 1727203972.72793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203972.74248: done with get_vars() 7491 1727203972.74277: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7491 1727203972.74410: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:52:52 -0400 (0:00:00.132) 0:00:14.668 ***** 7491 1727203972.74466: entering _queue_task() for managed-node3/yum 7491 1727203972.74468: Creating lock for yum 7491 1727203972.74799: worker is 1 (out of 1 available) 7491 1727203972.74815: exiting _queue_task() for managed-node3/yum 7491 1727203972.74831: 
done queuing things up, now waiting for results queue to drain 7491 1727203972.74833: waiting for pending results... 7491 1727203972.75120: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7491 1727203972.75259: in run() - task 0affcd87-79f5-0a4a-ad01-00000000001c 7491 1727203972.75284: variable 'ansible_search_path' from source: unknown 7491 1727203972.75292: variable 'ansible_search_path' from source: unknown 7491 1727203972.75338: calling self._execute() 7491 1727203972.75440: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203972.75454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203972.75472: variable 'omit' from source: magic vars 7491 1727203972.75875: variable 'ansible_distribution_major_version' from source: facts 7491 1727203972.75893: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203972.76085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203972.80228: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203972.80328: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203972.80375: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203972.80430: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203972.80494: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203972.80546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203972.80578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203972.80611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203972.80649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203972.80660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203972.80741: variable 'ansible_distribution_major_version' from source: facts 7491 1727203972.80753: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7491 1727203972.80756: when evaluation is False, skipping this task 7491 1727203972.80759: _execute() done 7491 1727203972.80762: dumping result to json 7491 1727203972.80767: done dumping result, returning 7491 1727203972.80775: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-00000000001c] 7491 1727203972.80780: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001c 7491 1727203972.80876: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001c 7491 1727203972.80878: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7491 1727203972.80932: no more pending results, returning what we have 7491 1727203972.80936: results queue empty 7491 1727203972.80937: checking for any_errors_fatal 7491 1727203972.80943: done checking for any_errors_fatal 7491 1727203972.80944: checking for max_fail_percentage 7491 1727203972.80946: done checking for max_fail_percentage 7491 1727203972.80946: checking to see if all hosts have failed and the running result is not ok 7491 1727203972.80948: done checking to see if all hosts have failed 7491 1727203972.80948: getting the remaining hosts for this loop 7491 1727203972.80950: done getting the remaining hosts for this loop 7491 1727203972.80954: getting the next task for host managed-node3 7491 1727203972.80961: done getting next task for host managed-node3 7491 1727203972.80967: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7491 1727203972.80970: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203972.80985: getting variables 7491 1727203972.80988: in VariableManager get_vars() 7491 1727203972.81142: Calling all_inventory to load vars for managed-node3 7491 1727203972.81145: Calling groups_inventory to load vars for managed-node3 7491 1727203972.81148: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203972.81157: Calling all_plugins_play to load vars for managed-node3 7491 1727203972.81160: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203972.81163: Calling groups_plugins_play to load vars for managed-node3 7491 1727203972.83045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203972.84013: done with get_vars() 7491 1727203972.84035: done getting variables 7491 1727203972.84083: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:52:52 -0400 (0:00:00.096) 0:00:14.764 ***** 7491 1727203972.84109: entering _queue_task() for managed-node3/fail 7491 1727203972.84335: worker is 1 (out of 1 available) 7491 1727203972.84348: exiting _queue_task() for managed-node3/fail 7491 1727203972.84360: done queuing things up, now waiting for results queue to drain 7491 1727203972.84362: waiting for pending results... 
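The skips recorded above all follow the same pattern: Ansible renders the task's `when:` expression against host facts, and if it is false it emits a `skipping:` result instead of invoking the module. Below is a minimal sketch in plain Python (not Ansible's real Jinja2 templar) of that decision for the conditional seen in this log, `ansible_distribution_major_version | int < 8`; the fact value `"9"` is illustrative, not read from the managed node.

```python
# Sketch of how a false `when:` conditional turns into the "skipping"
# result printed in this log. Plain Python stand-in for the Jinja2
# templar; the fact value is an assumption for illustration.

def evaluate_conditional(facts: dict) -> bool:
    # Mirrors `ansible_distribution_major_version | int < 8`:
    # the fact arrives as a string, so it is cast to int first.
    return int(facts["ansible_distribution_major_version"]) < 8

def run_task(facts: dict) -> dict:
    condition = "ansible_distribution_major_version | int < 8"
    if not evaluate_conditional(facts):
        # Shape matches the skip result the log prints for the host.
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}  # would actually dispatch the yum/dnf action

result = run_task({"ansible_distribution_major_version": "9"})
```

With a major version of 9 the condition is false, so the task is skipped and `changed` stays `false`, exactly as the JSON result above shows.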
7491 1727203972.84537: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7491 1727203972.84625: in run() - task 0affcd87-79f5-0a4a-ad01-00000000001d 7491 1727203972.84637: variable 'ansible_search_path' from source: unknown 7491 1727203972.84640: variable 'ansible_search_path' from source: unknown 7491 1727203972.84670: calling self._execute() 7491 1727203972.84742: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203972.84745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203972.84755: variable 'omit' from source: magic vars 7491 1727203972.85023: variable 'ansible_distribution_major_version' from source: facts 7491 1727203972.85031: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203972.85113: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203972.85247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203972.86868: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203972.87170: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203972.87199: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203972.87229: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203972.87249: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203972.87313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7491 1727203972.87339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203972.87357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203972.87385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203972.87396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203972.87434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203972.87452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203972.87470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203972.87495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203972.87505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7491 1727203972.87536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203972.87559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203972.87577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203972.87602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203972.87612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203972.87728: variable 'network_connections' from source: task vars 7491 1727203972.87739: variable 'interface' from source: play vars 7491 1727203972.87799: variable 'interface' from source: play vars 7491 1727203972.87850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203972.87959: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203972.87990: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203972.88013: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203972.88035: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7491 1727203972.88067: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203972.88084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203972.88103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203972.88120: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203972.88168: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203972.91361: variable 'network_connections' from source: task vars 7491 1727203972.91366: variable 'interface' from source: play vars 7491 1727203972.91415: variable 'interface' from source: play vars 7491 1727203972.91443: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727203972.91447: when evaluation is False, skipping this task 7491 1727203972.91449: _execute() done 7491 1727203972.91452: dumping result to json 7491 1727203972.91455: done dumping result, returning 7491 1727203972.91459: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-00000000001d] 7491 1727203972.91468: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001d 7491 1727203972.91557: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001d 7491 1727203972.91559: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7491 1727203972.91630: no more pending results, returning what we have 7491 1727203972.91633: results queue empty 7491 1727203972.91634: checking for any_errors_fatal 7491 1727203972.91639: done checking for any_errors_fatal 7491 1727203972.91640: checking for max_fail_percentage 7491 1727203972.91641: done checking for max_fail_percentage 7491 1727203972.91642: checking to see if all hosts have failed and the running result is not ok 7491 1727203972.91643: done checking to see if all hosts have failed 7491 1727203972.91644: getting the remaining hosts for this loop 7491 1727203972.91645: done getting the remaining hosts for this loop 7491 1727203972.91649: getting the next task for host managed-node3 7491 1727203972.91654: done getting next task for host managed-node3 7491 1727203972.91658: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7491 1727203972.91660: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203972.91681: getting variables 7491 1727203972.91682: in VariableManager get_vars() 7491 1727203972.91732: Calling all_inventory to load vars for managed-node3 7491 1727203972.91735: Calling groups_inventory to load vars for managed-node3 7491 1727203972.91737: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203972.91747: Calling all_plugins_play to load vars for managed-node3 7491 1727203972.91749: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203972.91751: Calling groups_plugins_play to load vars for managed-node3 7491 1727203972.95256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203972.96157: done with get_vars() 7491 1727203972.96178: done getting variables 7491 1727203972.96213: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:52:52 -0400 (0:00:00.121) 0:00:14.886 ***** 7491 1727203972.96236: entering _queue_task() for managed-node3/package 7491 1727203972.96488: worker is 1 (out of 1 available) 7491 1727203972.96503: exiting _queue_task() for managed-node3/package 7491 1727203972.96517: done queuing things up, now waiting for results queue to drain 7491 1727203972.96519: waiting for pending results... 
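The "Install packages" task that follows is guarded by `when: not network_packages is subset(ansible_facts.packages.keys())`; the log later evaluates this to False because every required package is already installed. Ansible's builtin `subset` test is essentially set containment, sketched here in plain Python with illustrative package names (not read from the managed node):

```python
# Hedged sketch of the `subset` test deciding the Install packages task.
# Package names and the installed-facts shape are assumptions for
# illustration only.

def is_subset(needed, available) -> bool:
    # Equivalent of `network_packages is subset(ansible_facts.packages.keys())`
    return set(needed) <= set(available)

network_packages = ["NetworkManager"]
installed_packages = {"NetworkManager": "1.x", "openssh": "9.x"}

# `when: not network_packages is subset(...)` -> False -> task skipped
run_install = not is_subset(network_packages, installed_packages.keys())
```

Since `run_install` is False here, the role skips the package installation, matching the skip result that closes this log excerpt.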
7491 1727203972.96704: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 7491 1727203972.96805: in run() - task 0affcd87-79f5-0a4a-ad01-00000000001e 7491 1727203972.96817: variable 'ansible_search_path' from source: unknown 7491 1727203972.96821: variable 'ansible_search_path' from source: unknown 7491 1727203972.96853: calling self._execute() 7491 1727203972.96928: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203972.96933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203972.96940: variable 'omit' from source: magic vars 7491 1727203972.97229: variable 'ansible_distribution_major_version' from source: facts 7491 1727203972.97238: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203972.97377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203972.97662: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203972.97715: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203972.97799: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203972.97840: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203972.97956: variable 'network_packages' from source: role '' defaults 7491 1727203972.98069: variable '__network_provider_setup' from source: role '' defaults 7491 1727203972.98084: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203972.98152: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203972.98169: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727203972.98233: variable '__network_packages_default_nm' from source: role 
'' defaults 7491 1727203972.98416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203973.00196: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203973.00245: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203973.00273: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203973.00298: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203973.00328: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203973.00387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203973.00408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.00427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.00458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.00471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.00502: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203973.00521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.00537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.00568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.00578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.00721: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7491 1727203973.00807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203973.00824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.00841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.00868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.00879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.00948: variable 'ansible_python' from source: facts 7491 1727203973.00970: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7491 1727203973.01037: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727203973.01144: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727203973.01278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203973.01304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.01342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.01383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.01410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.01460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203973.01502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.01537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.01584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.01605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.01778: variable 'network_connections' from source: task vars 7491 1727203973.01791: variable 'interface' from source: play vars 7491 1727203973.01908: variable 'interface' from source: play vars 7491 1727203973.01996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203973.02031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203973.02066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.02112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203973.02161: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203973.02488: variable 'network_connections' from source: task vars 7491 1727203973.02499: variable 'interface' from source: play vars 7491 1727203973.02613: variable 'interface' from source: play vars 7491 1727203973.02687: variable '__network_packages_default_wireless' from source: role '' defaults 7491 1727203973.02777: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203973.02992: variable 'network_connections' from source: task vars 7491 1727203973.02995: variable 'interface' from source: play vars 7491 1727203973.03043: variable 'interface' from source: play vars 7491 1727203973.03064: variable '__network_packages_default_team' from source: role '' defaults 7491 1727203973.03123: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203973.03321: variable 'network_connections' from source: task vars 7491 1727203973.03327: variable 'interface' from source: play vars 7491 1727203973.03373: variable 'interface' from source: play vars 7491 1727203973.03423: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203973.03466: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203973.03472: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727203973.03517: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727203973.03661: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7491 1727203973.03966: variable 'network_connections' from source: task vars 7491 1727203973.03972: variable 'interface' from source: play vars 7491 1727203973.04015: variable 'interface' from source: play vars 7491 
1727203973.04027: variable 'ansible_distribution' from source: facts 7491 1727203973.04030: variable '__network_rh_distros' from source: role '' defaults 7491 1727203973.04039: variable 'ansible_distribution_major_version' from source: facts 7491 1727203973.04058: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7491 1727203973.04167: variable 'ansible_distribution' from source: facts 7491 1727203973.04172: variable '__network_rh_distros' from source: role '' defaults 7491 1727203973.04175: variable 'ansible_distribution_major_version' from source: facts 7491 1727203973.04185: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7491 1727203973.04293: variable 'ansible_distribution' from source: facts 7491 1727203973.04296: variable '__network_rh_distros' from source: role '' defaults 7491 1727203973.04302: variable 'ansible_distribution_major_version' from source: facts 7491 1727203973.04331: variable 'network_provider' from source: set_fact 7491 1727203973.04342: variable 'ansible_facts' from source: unknown 7491 1727203973.05038: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7491 1727203973.05041: when evaluation is False, skipping this task 7491 1727203973.05044: _execute() done 7491 1727203973.05046: dumping result to json 7491 1727203973.05048: done dumping result, returning 7491 1727203973.05058: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-0a4a-ad01-00000000001e] 7491 1727203973.05065: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001e 7491 1727203973.05160: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001e 7491 1727203973.05163: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result 
was False" } 7491 1727203973.05211: no more pending results, returning what we have 7491 1727203973.05215: results queue empty 7491 1727203973.05216: checking for any_errors_fatal 7491 1727203973.05224: done checking for any_errors_fatal 7491 1727203973.05225: checking for max_fail_percentage 7491 1727203973.05226: done checking for max_fail_percentage 7491 1727203973.05227: checking to see if all hosts have failed and the running result is not ok 7491 1727203973.05228: done checking to see if all hosts have failed 7491 1727203973.05229: getting the remaining hosts for this loop 7491 1727203973.05230: done getting the remaining hosts for this loop 7491 1727203973.05234: getting the next task for host managed-node3 7491 1727203973.05241: done getting next task for host managed-node3 7491 1727203973.05245: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7491 1727203973.05247: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203973.05262: getting variables 7491 1727203973.05263: in VariableManager get_vars() 7491 1727203973.05312: Calling all_inventory to load vars for managed-node3 7491 1727203973.05315: Calling groups_inventory to load vars for managed-node3 7491 1727203973.05317: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203973.05327: Calling all_plugins_play to load vars for managed-node3 7491 1727203973.05329: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203973.05332: Calling groups_plugins_play to load vars for managed-node3 7491 1727203973.06738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203973.07701: done with get_vars() 7491 1727203973.07722: done getting variables 7491 1727203973.07767: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:52:53 -0400 (0:00:00.115) 0:00:15.001 ***** 7491 1727203973.07792: entering _queue_task() for managed-node3/package 7491 1727203973.08010: worker is 1 (out of 1 available) 7491 1727203973.08027: exiting _queue_task() for managed-node3/package 7491 1727203973.08040: done queuing things up, now waiting for results queue to drain 7491 1727203973.08041: waiting for pending results... 
7491 1727203973.08228: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7491 1727203973.08400: in run() - task 0affcd87-79f5-0a4a-ad01-00000000001f 7491 1727203973.08424: variable 'ansible_search_path' from source: unknown 7491 1727203973.08434: variable 'ansible_search_path' from source: unknown 7491 1727203973.08482: calling self._execute() 7491 1727203973.08586: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203973.08602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203973.08616: variable 'omit' from source: magic vars 7491 1727203973.09034: variable 'ansible_distribution_major_version' from source: facts 7491 1727203973.09051: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203973.09190: variable 'network_state' from source: role '' defaults 7491 1727203973.09207: Evaluated conditional (network_state != {}): False 7491 1727203973.09225: when evaluation is False, skipping this task 7491 1727203973.09237: _execute() done 7491 1727203973.09248: dumping result to json 7491 1727203973.09257: done dumping result, returning 7491 1727203973.09268: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-0a4a-ad01-00000000001f] 7491 1727203973.09280: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001f 7491 1727203973.09402: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000001f 7491 1727203973.09405: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203973.09453: no more pending results, returning what we have 7491 1727203973.09457: results queue empty 7491 1727203973.09458: checking for any_errors_fatal 
7491 1727203973.09466: done checking for any_errors_fatal 7491 1727203973.09467: checking for max_fail_percentage 7491 1727203973.09469: done checking for max_fail_percentage 7491 1727203973.09470: checking to see if all hosts have failed and the running result is not ok 7491 1727203973.09471: done checking to see if all hosts have failed 7491 1727203973.09472: getting the remaining hosts for this loop 7491 1727203973.09474: done getting the remaining hosts for this loop 7491 1727203973.09478: getting the next task for host managed-node3 7491 1727203973.09484: done getting next task for host managed-node3 7491 1727203973.09487: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7491 1727203973.09490: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203973.09506: getting variables 7491 1727203973.09508: in VariableManager get_vars() 7491 1727203973.09557: Calling all_inventory to load vars for managed-node3 7491 1727203973.09560: Calling groups_inventory to load vars for managed-node3 7491 1727203973.09562: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203973.09572: Calling all_plugins_play to load vars for managed-node3 7491 1727203973.09575: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203973.09577: Calling groups_plugins_play to load vars for managed-node3 7491 1727203973.11127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203973.12055: done with get_vars() 7491 1727203973.12075: done getting variables 7491 1727203973.12120: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:52:53 -0400 (0:00:00.043) 0:00:15.045 ***** 7491 1727203973.12145: entering _queue_task() for managed-node3/package 7491 1727203973.12371: worker is 1 (out of 1 available) 7491 1727203973.12384: exiting _queue_task() for managed-node3/package 7491 1727203973.12398: done queuing things up, now waiting for results queue to drain 7491 1727203973.12400: waiting for pending results... 
7491 1727203973.12586: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7491 1727203973.12702: in run() - task 0affcd87-79f5-0a4a-ad01-000000000020 7491 1727203973.12723: variable 'ansible_search_path' from source: unknown 7491 1727203973.12730: variable 'ansible_search_path' from source: unknown 7491 1727203973.12773: calling self._execute() 7491 1727203973.12901: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203973.12912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203973.12929: variable 'omit' from source: magic vars 7491 1727203973.13303: variable 'ansible_distribution_major_version' from source: facts 7491 1727203973.13321: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203973.13441: variable 'network_state' from source: role '' defaults 7491 1727203973.13456: Evaluated conditional (network_state != {}): False 7491 1727203973.13466: when evaluation is False, skipping this task 7491 1727203973.13474: _execute() done 7491 1727203973.13481: dumping result to json 7491 1727203973.13489: done dumping result, returning 7491 1727203973.13499: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-0a4a-ad01-000000000020] 7491 1727203973.13510: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000020 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203973.13663: no more pending results, returning what we have 7491 1727203973.13671: results queue empty 7491 1727203973.13672: checking for any_errors_fatal 7491 1727203973.13682: done checking for any_errors_fatal 7491 1727203973.13682: checking for max_fail_percentage 7491 1727203973.13684: done checking for 
max_fail_percentage 7491 1727203973.13684: checking to see if all hosts have failed and the running result is not ok 7491 1727203973.13686: done checking to see if all hosts have failed 7491 1727203973.13686: getting the remaining hosts for this loop 7491 1727203973.13688: done getting the remaining hosts for this loop 7491 1727203973.13693: getting the next task for host managed-node3 7491 1727203973.13699: done getting next task for host managed-node3 7491 1727203973.13703: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7491 1727203973.13707: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203973.13728: getting variables 7491 1727203973.13730: in VariableManager get_vars() 7491 1727203973.13789: Calling all_inventory to load vars for managed-node3 7491 1727203973.13792: Calling groups_inventory to load vars for managed-node3 7491 1727203973.13795: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203973.13807: Calling all_plugins_play to load vars for managed-node3 7491 1727203973.13810: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203973.13814: Calling groups_plugins_play to load vars for managed-node3 7491 1727203973.14335: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000020 7491 1727203973.14338: WORKER PROCESS EXITING 7491 1727203973.14825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203973.15869: done with get_vars() 7491 1727203973.15884: done getting variables 7491 1727203973.15960: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:52:53 -0400 (0:00:00.038) 0:00:15.083 ***** 7491 1727203973.15986: entering _queue_task() for managed-node3/service 7491 1727203973.15987: Creating lock for service 7491 1727203973.16202: worker is 1 (out of 1 available) 7491 1727203973.16219: exiting _queue_task() for managed-node3/service 7491 1727203973.16233: done queuing things up, now waiting for results queue to drain 7491 1727203973.16234: waiting for pending results... 
7491 1727203973.16408: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7491 1727203973.16497: in run() - task 0affcd87-79f5-0a4a-ad01-000000000021 7491 1727203973.16509: variable 'ansible_search_path' from source: unknown 7491 1727203973.16512: variable 'ansible_search_path' from source: unknown 7491 1727203973.16541: calling self._execute() 7491 1727203973.16618: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203973.16622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203973.16629: variable 'omit' from source: magic vars 7491 1727203973.16896: variable 'ansible_distribution_major_version' from source: facts 7491 1727203973.16908: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203973.16991: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203973.17124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203973.18703: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203973.18754: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203973.18785: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203973.18811: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203973.18831: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203973.18892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 
1727203973.18911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.18930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.18957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.18971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.19004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203973.19021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.19037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.19063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.19077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7491 1727203973.19107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203973.19124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.19141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.19166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.19177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.19294: variable 'network_connections' from source: task vars 7491 1727203973.19304: variable 'interface' from source: play vars 7491 1727203973.19357: variable 'interface' from source: play vars 7491 1727203973.19409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203973.19515: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203973.19554: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203973.19577: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203973.19600: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203973.19635: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203973.19650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203973.19668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.19686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203973.19732: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203973.19887: variable 'network_connections' from source: task vars 7491 1727203973.19891: variable 'interface' from source: play vars 7491 1727203973.19936: variable 'interface' from source: play vars 7491 1727203973.19962: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727203973.19967: when evaluation is False, skipping this task 7491 1727203973.19970: _execute() done 7491 1727203973.19973: dumping result to json 7491 1727203973.19975: done dumping result, returning 7491 1727203973.19983: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-000000000021] 7491 1727203973.19988: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000021 7491 1727203973.20079: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000021 7491 1727203973.20093: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7491 1727203973.20137: no more pending results, returning what we have 7491 1727203973.20141: results queue empty 7491 1727203973.20142: checking for any_errors_fatal 7491 1727203973.20150: done checking for any_errors_fatal 7491 1727203973.20151: checking for max_fail_percentage 7491 1727203973.20153: done checking for max_fail_percentage 7491 1727203973.20154: checking to see if all hosts have failed and the running result is not ok 7491 1727203973.20155: done checking to see if all hosts have failed 7491 1727203973.20156: getting the remaining hosts for this loop 7491 1727203973.20158: done getting the remaining hosts for this loop 7491 1727203973.20161: getting the next task for host managed-node3 7491 1727203973.20172: done getting next task for host managed-node3 7491 1727203973.20176: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7491 1727203973.20179: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203973.20197: getting variables 7491 1727203973.20198: in VariableManager get_vars() 7491 1727203973.20245: Calling all_inventory to load vars for managed-node3 7491 1727203973.20247: Calling groups_inventory to load vars for managed-node3 7491 1727203973.20249: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203973.20258: Calling all_plugins_play to load vars for managed-node3 7491 1727203973.20260: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203973.20263: Calling groups_plugins_play to load vars for managed-node3 7491 1727203973.21054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203973.21979: done with get_vars() 7491 1727203973.21996: done getting variables 7491 1727203973.22041: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:52:53 -0400 (0:00:00.060) 0:00:15.144 ***** 7491 1727203973.22067: entering _queue_task() for managed-node3/service 7491 1727203973.22267: worker is 1 (out of 1 available) 7491 1727203973.22283: exiting _queue_task() for managed-node3/service 7491 1727203973.22295: done queuing things up, now waiting for results queue to drain 7491 1727203973.22297: waiting for pending results... 
7491 1727203973.22469: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7491 1727203973.22572: in run() - task 0affcd87-79f5-0a4a-ad01-000000000022 7491 1727203973.22585: variable 'ansible_search_path' from source: unknown 7491 1727203973.22589: variable 'ansible_search_path' from source: unknown 7491 1727203973.22612: calling self._execute() 7491 1727203973.22683: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203973.22688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203973.22696: variable 'omit' from source: magic vars 7491 1727203973.22961: variable 'ansible_distribution_major_version' from source: facts 7491 1727203973.22973: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203973.23085: variable 'network_provider' from source: set_fact 7491 1727203973.23089: variable 'network_state' from source: role '' defaults 7491 1727203973.23099: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7491 1727203973.23105: variable 'omit' from source: magic vars 7491 1727203973.23145: variable 'omit' from source: magic vars 7491 1727203973.23166: variable 'network_service_name' from source: role '' defaults 7491 1727203973.23221: variable 'network_service_name' from source: role '' defaults 7491 1727203973.23292: variable '__network_provider_setup' from source: role '' defaults 7491 1727203973.23298: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203973.23346: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203973.23352: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727203973.23399: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727203973.23550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 
1727203973.25118: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203973.25173: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203973.25202: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203973.25229: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203973.25250: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203973.25308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203973.25330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.25347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.25377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.25388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.25423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7491 1727203973.25439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.25457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.25486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.25497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.25643: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7491 1727203973.25720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203973.25739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.25756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.25784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.25795: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.25858: variable 'ansible_python' from source: facts 7491 1727203973.25877: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7491 1727203973.25936: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727203973.25991: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727203973.26078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203973.26095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.26112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.26142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.26152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.26188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203973.26207: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203973.26228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.26254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203973.26265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203973.26357: variable 'network_connections' from source: task vars 7491 1727203973.26365: variable 'interface' from source: play vars 7491 1727203973.26419: variable 'interface' from source: play vars 7491 1727203973.26496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203973.26873: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203973.26911: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203973.26942: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203973.26974: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203973.27018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203973.27041: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203973.27066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203973.27088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203973.27127: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203973.27308: variable 'network_connections' from source: task vars 7491 1727203973.27311: variable 'interface' from source: play vars 7491 1727203973.27369: variable 'interface' from source: play vars 7491 1727203973.27405: variable '__network_packages_default_wireless' from source: role '' defaults 7491 1727203973.27462: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203973.27654: variable 'network_connections' from source: task vars 7491 1727203973.27657: variable 'interface' from source: play vars 7491 1727203973.27706: variable 'interface' from source: play vars 7491 1727203973.27729: variable '__network_packages_default_team' from source: role '' defaults 7491 1727203973.27786: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203973.27971: variable 'network_connections' from source: task vars 7491 1727203973.27979: variable 'interface' from source: play vars 7491 1727203973.28031: variable 'interface' from source: play vars 7491 1727203973.28076: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203973.28118: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203973.28126: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7491 1727203973.28170: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727203973.28309: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7491 1727203973.28640: variable 'network_connections' from source: task vars 7491 1727203973.28643: variable 'interface' from source: play vars 7491 1727203973.28686: variable 'interface' from source: play vars 7491 1727203973.28695: variable 'ansible_distribution' from source: facts 7491 1727203973.28698: variable '__network_rh_distros' from source: role '' defaults 7491 1727203973.28704: variable 'ansible_distribution_major_version' from source: facts 7491 1727203973.28723: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7491 1727203973.28840: variable 'ansible_distribution' from source: facts 7491 1727203973.28844: variable '__network_rh_distros' from source: role '' defaults 7491 1727203973.28847: variable 'ansible_distribution_major_version' from source: facts 7491 1727203973.28859: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7491 1727203973.28973: variable 'ansible_distribution' from source: facts 7491 1727203973.28976: variable '__network_rh_distros' from source: role '' defaults 7491 1727203973.28981: variable 'ansible_distribution_major_version' from source: facts 7491 1727203973.29009: variable 'network_provider' from source: set_fact 7491 1727203973.29029: variable 'omit' from source: magic vars 7491 1727203973.29051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203973.29076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203973.29092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203973.29104: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203973.29114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203973.29138: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203973.29141: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203973.29144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203973.29215: Set connection var ansible_timeout to 10 7491 1727203973.29223: Set connection var ansible_pipelining to False 7491 1727203973.29228: Set connection var ansible_shell_type to sh 7491 1727203973.29233: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203973.29240: Set connection var ansible_shell_executable to /bin/sh 7491 1727203973.29244: Set connection var ansible_connection to ssh 7491 1727203973.29265: variable 'ansible_shell_executable' from source: unknown 7491 1727203973.29268: variable 'ansible_connection' from source: unknown 7491 1727203973.29270: variable 'ansible_module_compression' from source: unknown 7491 1727203973.29272: variable 'ansible_shell_type' from source: unknown 7491 1727203973.29275: variable 'ansible_shell_executable' from source: unknown 7491 1727203973.29277: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203973.29281: variable 'ansible_pipelining' from source: unknown 7491 1727203973.29287: variable 'ansible_timeout' from source: unknown 7491 1727203973.29290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203973.29363: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203973.29374: variable 'omit' from source: magic vars 7491 1727203973.29381: starting attempt loop 7491 1727203973.29384: running the handler 7491 1727203973.29443: variable 'ansible_facts' from source: unknown 7491 1727203973.30131: _low_level_execute_command(): starting 7491 1727203973.30137: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203973.30830: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203973.30836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203973.30845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203973.30860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203973.30901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203973.30905: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203973.30913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203973.30937: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203973.30940: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203973.30978: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203973.30982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203973.30985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203973.31008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 
1727203973.31011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203973.31058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203973.31074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203973.31132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203973.32759: stdout chunk (state=3): >>>/root <<< 7491 1727203973.32873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203973.33078: stderr chunk (state=3): >>><<< 7491 1727203973.33082: stdout chunk (state=3): >>><<< 7491 1727203973.33086: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 7491 1727203973.33089: _low_level_execute_command(): starting 7491 1727203973.33091: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203973.329807-8462-251158558656637 `" && echo ansible-tmp-1727203973.329807-8462-251158558656637="` echo /root/.ansible/tmp/ansible-tmp-1727203973.329807-8462-251158558656637 `" ) && sleep 0' 7491 1727203973.33945: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203973.33960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203973.33992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203973.34029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203973.34086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203973.34103: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203973.34133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203973.34152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203973.34167: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203973.34186: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203973.34204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203973.34240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203973.34267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203973.34280: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203973.34291: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203973.34303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203973.34451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203973.34470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203973.34486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203973.34568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203973.36362: stdout chunk (state=3): >>>ansible-tmp-1727203973.329807-8462-251158558656637=/root/.ansible/tmp/ansible-tmp-1727203973.329807-8462-251158558656637 <<< 7491 1727203973.36476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203973.36535: stderr chunk (state=3): >>><<< 7491 1727203973.36538: stdout chunk (state=3): >>><<< 7491 1727203973.36570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203973.329807-8462-251158558656637=/root/.ansible/tmp/ansible-tmp-1727203973.329807-8462-251158558656637 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203973.36577: variable 'ansible_module_compression' from source: unknown 7491 1727203973.36631: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 7491 1727203973.36635: ANSIBALLZ: Acquiring lock 7491 1727203973.36638: ANSIBALLZ: Lock acquired: 139674606106048 7491 1727203973.36640: ANSIBALLZ: Creating module 7491 1727203973.60100: ANSIBALLZ: Writing module into payload 7491 1727203973.60239: ANSIBALLZ: Writing module 7491 1727203973.60268: ANSIBALLZ: Renaming module 7491 1727203973.60274: ANSIBALLZ: Done creating module 7491 1727203973.60304: variable 'ansible_facts' from source: unknown 7491 1727203973.60447: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203973.329807-8462-251158558656637/AnsiballZ_systemd.py 7491 1727203973.60579: Sending initial data 7491 1727203973.60582: Sent initial data (153 bytes) 7491 1727203973.61297: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203973.61302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203973.61341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203973.61344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config <<< 7491 1727203973.61346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203973.61401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203973.61404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203973.61406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203973.61460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203973.63250: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7491 1727203973.63259: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 7491 1727203973.63269: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 7491 1727203973.63284: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 7491 1727203973.63290: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203973.63332: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 7491 1727203973.63334: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203973.63375: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpqakp9gl6 
/root/.ansible/tmp/ansible-tmp-1727203973.329807-8462-251158558656637/AnsiballZ_systemd.py <<< 7491 1727203973.63412: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203973.65612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203973.65727: stderr chunk (state=3): >>><<< 7491 1727203973.65731: stdout chunk (state=3): >>><<< 7491 1727203973.65747: done transferring module to remote 7491 1727203973.65758: _low_level_execute_command(): starting 7491 1727203973.65761: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203973.329807-8462-251158558656637/ /root/.ansible/tmp/ansible-tmp-1727203973.329807-8462-251158558656637/AnsiballZ_systemd.py && sleep 0' 7491 1727203973.66233: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203973.66240: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203973.66251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203973.66260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203973.66294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203973.66303: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203973.66309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203973.66321: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203973.66328: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203973.66333: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203973.66338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 
1727203973.66347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203973.66352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203973.66359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203973.66366: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203973.66371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203973.66424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203973.66445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203973.66449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203973.66496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203973.68219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203973.68303: stderr chunk (state=3): >>><<< 7491 1727203973.68311: stdout chunk (state=3): >>><<< 7491 1727203973.68341: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203973.68350: _low_level_execute_command(): starting 7491 1727203973.68353: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203973.329807-8462-251158558656637/AnsiballZ_systemd.py && sleep 0' 7491 1727203973.68819: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203973.68823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203973.68877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203973.68880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203973.68882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203973.68936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203973.68939: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203973.68994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203973.93869: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.sl<<< 7491 1727203973.93902: stdout chunk (state=3): >>>ice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "14974976", "MemoryAvailable": "infinity", "CPUUsageNSec": "105536000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": 
"infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": 
"0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "<<< 7491 1727203973.93906: stdout chunk (state=3): >>>SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "basic.target dbus.socket system.slice dbus-broker.service network-pre.target systemd-journald.socket cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:47:46 EDT", "StateChangeTimestampMonotonic": "12973041", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", 
"StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7491 1727203973.95469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
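The module result above carries systemd exec-status properties such as `ExecStart` and `ExecReload` in systemd's `{ key=value ; key=value ; ... }` rendering. A minimal sketch of pulling individual fields out of such a string; the `parse_exec_status` helper is hypothetical (neither Ansible nor systemd ships it), and it assumes the fields are separated exactly by `" ; "` as in the log:

```python
# Parse systemd's "{ path=... ; argv[]=... ; pid=616 ; ... }" exec-status
# rendering (as seen in ExecStart/ExecReload above) into a plain dict.
# Hypothetical helper for illustration only.

def parse_exec_status(raw: str) -> dict:
    # Drop the surrounding braces, then split on systemd's " ; " separator.
    inner = raw.strip().lstrip("{").rstrip("}")
    fields = {}
    for part in inner.split(" ; "):
        key, _, value = part.strip().partition("=")
        fields[key] = value
    return fields

exec_start = ("{ path=/usr/sbin/NetworkManager ; "
              "argv[]=/usr/sbin/NetworkManager --no-daemon ; "
              "ignore_errors=no ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; "
              "stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }")

info = parse_exec_status(exec_start)
# info["pid"] is "616"; info["argv[]"] carries the full command line.
```

This only illustrates the string shape; for programmatic access the underlying D-Bus properties would be queried directly rather than re-parsed from display strings.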
<<< 7491 1727203973.95473: stdout chunk (state=3): >>><<< 7491 1727203973.95480: stderr chunk (state=3): >>><<< 7491 1727203973.95496: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "14974976", "MemoryAvailable": "infinity", "CPUUsageNSec": "105536000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "basic.target dbus.socket system.slice dbus-broker.service network-pre.target systemd-journald.socket cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:47:46 EDT", "StateChangeTimestampMonotonic": "12973041", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203973.95678: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203973.329807-8462-251158558656637/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203973.95693: _low_level_execute_command(): starting 7491 1727203973.95698: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203973.329807-8462-251158558656637/ > /dev/null 2>&1 && sleep 0' 7491 1727203973.96336: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203973.96346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203973.96356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203973.96372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203973.96412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203973.96421: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203973.96429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 
1727203973.96442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203973.96450: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203973.96456: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203973.96465: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203973.96478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203973.96489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203973.96499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203973.96502: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203973.96511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203973.96587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203973.96606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203973.96614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203973.96755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203973.98562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203973.98568: stdout chunk (state=3): >>><<< 7491 1727203973.98575: stderr chunk (state=3): >>><<< 7491 1727203973.98592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203973.98600: handler run complete 7491 1727203973.98665: attempt loop complete, returning result 7491 1727203973.98669: _execute() done 7491 1727203973.98671: dumping result to json 7491 1727203973.98689: done dumping result, returning 7491 1727203973.98699: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-0a4a-ad01-000000000022] 7491 1727203973.98705: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000022 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203973.99046: no more pending results, returning what we have 7491 1727203973.99050: results queue empty 7491 1727203973.99051: checking for any_errors_fatal 7491 1727203973.99057: done checking for any_errors_fatal 7491 1727203973.99058: checking for max_fail_percentage 7491 1727203973.99059: done checking for max_fail_percentage 7491 1727203973.99060: checking to see if all hosts have failed and the running result is not ok 7491 1727203973.99061: done checking to see if all hosts have failed 7491 
1727203973.99062: getting the remaining hosts for this loop 7491 1727203973.99068: done getting the remaining hosts for this loop 7491 1727203973.99073: getting the next task for host managed-node3 7491 1727203973.99079: done getting next task for host managed-node3 7491 1727203973.99083: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7491 1727203973.99086: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203973.99096: getting variables 7491 1727203973.99098: in VariableManager get_vars() 7491 1727203973.99144: Calling all_inventory to load vars for managed-node3 7491 1727203973.99147: Calling groups_inventory to load vars for managed-node3 7491 1727203973.99150: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203973.99160: Calling all_plugins_play to load vars for managed-node3 7491 1727203973.99162: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203973.99167: Calling groups_plugins_play to load vars for managed-node3 7491 1727203974.00748: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000022 7491 1727203974.00752: WORKER PROCESS EXITING 7491 1727203974.01120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203974.02726: done with get_vars() 7491 1727203974.02754: done getting variables 7491 1727203974.02816: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:52:54 -0400 (0:00:00.807) 0:00:15.952 ***** 7491 1727203974.02856: entering _queue_task() for managed-node3/service 7491 1727203974.03145: worker is 1 (out of 1 available) 7491 1727203974.03159: exiting _queue_task() for managed-node3/service 7491 1727203974.03175: done queuing things up, now waiting for results queue to drain 7491 1727203974.03177: waiting for pending results... 7491 1727203974.03476: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7491 1727203974.03636: in run() - task 0affcd87-79f5-0a4a-ad01-000000000023 7491 1727203974.03658: variable 'ansible_search_path' from source: unknown 7491 1727203974.03669: variable 'ansible_search_path' from source: unknown 7491 1727203974.03712: calling self._execute() 7491 1727203974.03820: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203974.03836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203974.03852: variable 'omit' from source: magic vars 7491 1727203974.04251: variable 'ansible_distribution_major_version' from source: facts 7491 1727203974.04276: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203974.04403: variable 'network_provider' from source: set_fact 7491 1727203974.04416: Evaluated conditional (network_provider == "nm"): True 7491 1727203974.04516: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727203974.04612: 
variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727203974.04791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203974.07087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203974.07158: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203974.07208: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203974.07247: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203974.07282: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203974.07381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203974.07415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203974.07451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203974.07500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203974.07522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 
1727203974.07578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203974.07606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203974.07637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203974.07684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203974.07705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203974.07754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203974.07785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203974.07815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203974.07863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 7491 1727203974.07884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203974.08033: variable 'network_connections' from source: task vars 7491 1727203974.08052: variable 'interface' from source: play vars 7491 1727203974.08135: variable 'interface' from source: play vars 7491 1727203974.08222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203974.08393: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203974.08442: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203974.08481: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203974.08521: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203974.08570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203974.08597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203974.08632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203974.08665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203974.08718: variable 
'__network_wireless_connections_defined' from source: role '' defaults 7491 1727203974.08985: variable 'network_connections' from source: task vars 7491 1727203974.08996: variable 'interface' from source: play vars 7491 1727203974.09068: variable 'interface' from source: play vars 7491 1727203974.09117: Evaluated conditional (__network_wpa_supplicant_required): False 7491 1727203974.09125: when evaluation is False, skipping this task 7491 1727203974.09131: _execute() done 7491 1727203974.09139: dumping result to json 7491 1727203974.09146: done dumping result, returning 7491 1727203974.09157: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-0a4a-ad01-000000000023] 7491 1727203974.09184: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000023 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7491 1727203974.09333: no more pending results, returning what we have 7491 1727203974.09337: results queue empty 7491 1727203974.09338: checking for any_errors_fatal 7491 1727203974.09360: done checking for any_errors_fatal 7491 1727203974.09362: checking for max_fail_percentage 7491 1727203974.09367: done checking for max_fail_percentage 7491 1727203974.09368: checking to see if all hosts have failed and the running result is not ok 7491 1727203974.09369: done checking to see if all hosts have failed 7491 1727203974.09370: getting the remaining hosts for this loop 7491 1727203974.09372: done getting the remaining hosts for this loop 7491 1727203974.09376: getting the next task for host managed-node3 7491 1727203974.09384: done getting next task for host managed-node3 7491 1727203974.09388: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7491 1727203974.09391: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203974.09405: getting variables 7491 1727203974.09407: in VariableManager get_vars() 7491 1727203974.09461: Calling all_inventory to load vars for managed-node3 7491 1727203974.09466: Calling groups_inventory to load vars for managed-node3 7491 1727203974.09469: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203974.09481: Calling all_plugins_play to load vars for managed-node3 7491 1727203974.09484: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203974.09487: Calling groups_plugins_play to load vars for managed-node3 7491 1727203974.10587: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000023 7491 1727203974.10591: WORKER PROCESS EXITING 7491 1727203974.11148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203974.12799: done with get_vars() 7491 1727203974.12834: done getting variables 7491 1727203974.12899: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:52:54 -0400 
(0:00:00.100) 0:00:16.053 ***** 7491 1727203974.12935: entering _queue_task() for managed-node3/service 7491 1727203974.13242: worker is 1 (out of 1 available) 7491 1727203974.13255: exiting _queue_task() for managed-node3/service 7491 1727203974.13273: done queuing things up, now waiting for results queue to drain 7491 1727203974.13274: waiting for pending results... 7491 1727203974.13571: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 7491 1727203974.13734: in run() - task 0affcd87-79f5-0a4a-ad01-000000000024 7491 1727203974.13756: variable 'ansible_search_path' from source: unknown 7491 1727203974.13766: variable 'ansible_search_path' from source: unknown 7491 1727203974.13809: calling self._execute() 7491 1727203974.13914: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203974.13926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203974.13947: variable 'omit' from source: magic vars 7491 1727203974.14330: variable 'ansible_distribution_major_version' from source: facts 7491 1727203974.14349: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203974.14477: variable 'network_provider' from source: set_fact 7491 1727203974.14491: Evaluated conditional (network_provider == "initscripts"): False 7491 1727203974.14499: when evaluation is False, skipping this task 7491 1727203974.14506: _execute() done 7491 1727203974.14516: dumping result to json 7491 1727203974.14525: done dumping result, returning 7491 1727203974.14535: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-0a4a-ad01-000000000024] 7491 1727203974.14547: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000024 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
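Each skipped task above leaves a record of the form `Evaluated conditional (...): False` followed by a `skipping:` JSON payload carrying the `false_condition`. When auditing a long `-vvvv` run, those records can be filtered out with a small pipeline; a sketch, where `ansible-run.log` is a hypothetical filename standing in for a captured copy of the log above:

```shell
# Pull every conditional evaluation out of a saved -vvvv run.
# The sample records are copied from the log above; the filename is illustrative.
cat > ansible-run.log <<'EOF'
7491 1727203974.09117: Evaluated conditional (__network_wpa_supplicant_required): False
7491 1727203974.14349: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203974.14491: Evaluated conditional (network_provider == "initscripts"): False
EOF
# Print just the conditional text and its True/False outcome.
grep -o 'Evaluated conditional (.*): .*' ansible-run.log
```

This surfaces at a glance which guards (here `network_provider == "initscripts"`) caused the initscripts-only tasks to be skipped on a NetworkManager host.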
7491 1727203974.14698: no more pending results, returning what we have 7491 1727203974.14702: results queue empty 7491 1727203974.14704: checking for any_errors_fatal 7491 1727203974.14714: done checking for any_errors_fatal 7491 1727203974.14715: checking for max_fail_percentage 7491 1727203974.14717: done checking for max_fail_percentage 7491 1727203974.14718: checking to see if all hosts have failed and the running result is not ok 7491 1727203974.14720: done checking to see if all hosts have failed 7491 1727203974.14720: getting the remaining hosts for this loop 7491 1727203974.14722: done getting the remaining hosts for this loop 7491 1727203974.14727: getting the next task for host managed-node3 7491 1727203974.14734: done getting next task for host managed-node3 7491 1727203974.14738: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7491 1727203974.14742: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203974.14758: getting variables 7491 1727203974.14760: in VariableManager get_vars() 7491 1727203974.14820: Calling all_inventory to load vars for managed-node3 7491 1727203974.14823: Calling groups_inventory to load vars for managed-node3 7491 1727203974.14826: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203974.14839: Calling all_plugins_play to load vars for managed-node3 7491 1727203974.14842: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203974.14846: Calling groups_plugins_play to load vars for managed-node3 7491 1727203974.15887: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000024 7491 1727203974.15891: WORKER PROCESS EXITING 7491 1727203974.16604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203974.18199: done with get_vars() 7491 1727203974.18225: done getting variables 7491 1727203974.18290: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:52:54 -0400 (0:00:00.053) 0:00:16.107 ***** 7491 1727203974.18326: entering _queue_task() for managed-node3/copy 7491 1727203974.18611: worker is 1 (out of 1 available) 7491 1727203974.18624: exiting _queue_task() for managed-node3/copy 7491 1727203974.18637: done queuing things up, now waiting for results queue to drain 7491 1727203974.18638: waiting for pending results... 
7491 1727203974.18920: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7491 1727203974.19068: in run() - task 0affcd87-79f5-0a4a-ad01-000000000025 7491 1727203974.19092: variable 'ansible_search_path' from source: unknown 7491 1727203974.19101: variable 'ansible_search_path' from source: unknown 7491 1727203974.19142: calling self._execute() 7491 1727203974.19240: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203974.19252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203974.19267: variable 'omit' from source: magic vars 7491 1727203974.19639: variable 'ansible_distribution_major_version' from source: facts 7491 1727203974.19656: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203974.19781: variable 'network_provider' from source: set_fact 7491 1727203974.19794: Evaluated conditional (network_provider == "initscripts"): False 7491 1727203974.19801: when evaluation is False, skipping this task 7491 1727203974.19809: _execute() done 7491 1727203974.19816: dumping result to json 7491 1727203974.19824: done dumping result, returning 7491 1727203974.19839: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-0a4a-ad01-000000000025] 7491 1727203974.19853: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000025 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7491 1727203974.20010: no more pending results, returning what we have 7491 1727203974.20014: results queue empty 7491 1727203974.20015: checking for any_errors_fatal 7491 1727203974.20024: done checking for any_errors_fatal 7491 1727203974.20025: checking for max_fail_percentage 7491 1727203974.20027: done 
checking for max_fail_percentage 7491 1727203974.20028: checking to see if all hosts have failed and the running result is not ok 7491 1727203974.20030: done checking to see if all hosts have failed 7491 1727203974.20030: getting the remaining hosts for this loop 7491 1727203974.20033: done getting the remaining hosts for this loop 7491 1727203974.20037: getting the next task for host managed-node3 7491 1727203974.20043: done getting next task for host managed-node3 7491 1727203974.20047: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7491 1727203974.20051: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203974.20068: getting variables 7491 1727203974.20070: in VariableManager get_vars() 7491 1727203974.20124: Calling all_inventory to load vars for managed-node3 7491 1727203974.20127: Calling groups_inventory to load vars for managed-node3 7491 1727203974.20130: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203974.20143: Calling all_plugins_play to load vars for managed-node3 7491 1727203974.20146: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203974.20150: Calling groups_plugins_play to load vars for managed-node3 7491 1727203974.21184: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000025 7491 1727203974.21188: WORKER PROCESS EXITING 7491 1727203974.21765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203974.23477: done with get_vars() 7491 1727203974.23500: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:52:54 -0400 (0:00:00.052) 0:00:16.159 ***** 7491 1727203974.23593: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 7491 1727203974.23596: Creating lock for fedora.linux_system_roles.network_connections 7491 1727203974.23878: worker is 1 (out of 1 available) 7491 1727203974.23890: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 7491 1727203974.23903: done queuing things up, now waiting for results queue to drain 7491 1727203974.23904: waiting for pending results... 
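The `_low_level_execute_command()` calls that follow show the first remote command the ssh connection plugin issues for a real (non-skipped) task: `/bin/sh -c 'echo ~ && sleep 0'`, which resolves the remote user's home directory so the per-task temp directory can be placed under it. The same probe can be run locally to see what the controller reads back (here it prints the local home rather than the managed node's):

```shell
# The home-directory probe quoted in the log below, executed locally.
# "echo ~" expands to the current user's home; "sleep 0" just terminates cleanly.
/bin/sh -c 'echo ~ && sleep 0'
```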
7491 1727203974.24180: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7491 1727203974.24318: in run() - task 0affcd87-79f5-0a4a-ad01-000000000026 7491 1727203974.24340: variable 'ansible_search_path' from source: unknown 7491 1727203974.24348: variable 'ansible_search_path' from source: unknown 7491 1727203974.24386: calling self._execute() 7491 1727203974.24485: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203974.24497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203974.24512: variable 'omit' from source: magic vars 7491 1727203974.24896: variable 'ansible_distribution_major_version' from source: facts 7491 1727203974.24913: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203974.24925: variable 'omit' from source: magic vars 7491 1727203974.24989: variable 'omit' from source: magic vars 7491 1727203974.25159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203974.27461: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203974.27540: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203974.27585: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203974.27629: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203974.27660: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203974.27748: variable 'network_provider' from source: set_fact 7491 1727203974.27894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203974.27949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203974.27984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203974.28037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203974.28057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203974.28136: variable 'omit' from source: magic vars 7491 1727203974.28262: variable 'omit' from source: magic vars 7491 1727203974.28374: variable 'network_connections' from source: task vars 7491 1727203974.28391: variable 'interface' from source: play vars 7491 1727203974.28457: variable 'interface' from source: play vars 7491 1727203974.28620: variable 'omit' from source: magic vars 7491 1727203974.28634: variable '__lsr_ansible_managed' from source: task vars 7491 1727203974.28703: variable '__lsr_ansible_managed' from source: task vars 7491 1727203974.28969: Loaded config def from plugin (lookup/template) 7491 1727203974.28979: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7491 1727203974.29009: File lookup term: get_ansible_managed.j2 7491 1727203974.29020: variable 'ansible_search_path' from source: unknown 7491 1727203974.29030: evaluation_path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7491 1727203974.29048: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7491 1727203974.29074: variable 'ansible_search_path' from source: unknown 7491 1727203974.35267: variable 'ansible_managed' from source: unknown 7491 1727203974.35405: variable 'omit' from source: magic vars 7491 1727203974.35438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203974.35473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203974.35501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203974.35528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203974.35543: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203974.35576: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203974.35585: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203974.35593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203974.35696: Set connection var ansible_timeout to 10 7491 1727203974.35709: Set connection var ansible_pipelining to False 7491 1727203974.35721: Set connection var ansible_shell_type to sh 7491 1727203974.35734: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203974.35747: Set connection var ansible_shell_executable to /bin/sh 7491 1727203974.35756: Set connection var ansible_connection to ssh 7491 1727203974.35785: variable 'ansible_shell_executable' from source: unknown 7491 1727203974.35794: variable 'ansible_connection' from source: unknown 7491 1727203974.35801: variable 'ansible_module_compression' from source: unknown 7491 1727203974.35808: variable 'ansible_shell_type' from source: unknown 7491 1727203974.35815: variable 'ansible_shell_executable' from source: unknown 7491 1727203974.35822: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203974.35831: variable 'ansible_pipelining' from source: unknown 7491 1727203974.35840: variable 'ansible_timeout' from source: unknown 7491 1727203974.35848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203974.35984: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203974.36010: variable 'omit' from source: magic vars 7491 1727203974.36022: starting attempt loop 7491 1727203974.36030: running the handler 7491 
1727203974.36049: _low_level_execute_command(): starting 7491 1727203974.36063: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203974.36839: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203974.36853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203974.36871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203974.36891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203974.36940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203974.36953: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203974.36968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203974.36986: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203974.36996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203974.37004: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203974.37014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203974.37028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203974.37044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203974.37055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203974.37067: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203974.37081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203974.37159: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 7491 1727203974.37178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203974.37192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203974.37270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203974.38895: stdout chunk (state=3): >>>/root <<< 7491 1727203974.39089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203974.39093: stdout chunk (state=3): >>><<< 7491 1727203974.39095: stderr chunk (state=3): >>><<< 7491 1727203974.39204: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203974.39209: _low_level_execute_command(): starting 7491 1727203974.39211: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203974.3911607-8499-154189691285579 `" && echo ansible-tmp-1727203974.3911607-8499-154189691285579="` echo /root/.ansible/tmp/ansible-tmp-1727203974.3911607-8499-154189691285579 `" ) && sleep 0' 7491 1727203974.40404: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203974.40408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203974.40449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203974.40452: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203974.40455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203974.40514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203974.40535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203974.40538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203974.40593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203974.42418: stdout chunk (state=3): >>>ansible-tmp-1727203974.3911607-8499-154189691285579=/root/.ansible/tmp/ansible-tmp-1727203974.3911607-8499-154189691285579 <<< 
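The command quoted above is Ansible's per-task workspace handshake: `umask 77` makes the directories private to the remote user, `mkdir -p` ensures the `.ansible/tmp` base exists, a uniquely named `ansible-tmp-<epoch>-<pid>-<random>` directory is created inside it, and its path is echoed back so the controller knows where to upload the AnsiballZ payload (which is then marked `u+x` and executed, as the later log records show). A local sketch of the same sequence, using an illustrative base path and a placeholder script instead of a real AnsiballZ module:

```shell
# Re-create the remote workspace handshake locally.
# BASE and NAME are illustrative stand-ins for /root/.ansible/tmp/ansible-tmp-...
BASE=/tmp/ansible-demo/.ansible/tmp
NAME="ansible-tmp-$(date +%s)-$$-000000000000000"
# umask 77 => the new directories are created mode 0700 (private to the user).
( umask 77 && mkdir -p "$BASE" && mkdir "$BASE/$NAME" && echo "$NAME=$BASE/$NAME" )
# The module payload is then copied in, made executable, and run:
printf '#!/bin/sh\necho module-ran\n' > "$BASE/$NAME/AnsiballZ_demo.sh"
chmod u+x "$BASE/$NAME" "$BASE/$NAME/AnsiballZ_demo.sh"
"$BASE/$NAME/AnsiballZ_demo.sh"   # prints: module-ran
```

The `echo`-back of the directory name is why the log records a `stdout chunk` containing `ansible-tmp-...=...`: the controller parses that line to learn the remote path.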
7491 1727203974.42536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203974.42615: stderr chunk (state=3): >>><<< 7491 1727203974.42619: stdout chunk (state=3): >>><<< 7491 1727203974.42774: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203974.3911607-8499-154189691285579=/root/.ansible/tmp/ansible-tmp-1727203974.3911607-8499-154189691285579 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203974.42778: variable 'ansible_module_compression' from source: unknown 7491 1727203974.42780: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 7491 1727203974.42782: ANSIBALLZ: Acquiring lock 7491 1727203974.42784: ANSIBALLZ: Lock acquired: 139674600736768 7491 1727203974.42786: ANSIBALLZ: Creating module 7491 1727203974.79398: ANSIBALLZ: Writing module into payload 7491 1727203974.79891: ANSIBALLZ: Writing module 7491 
1727203974.79924: ANSIBALLZ: Renaming module 7491 1727203974.79928: ANSIBALLZ: Done creating module 7491 1727203974.79957: variable 'ansible_facts' from source: unknown 7491 1727203974.80066: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203974.3911607-8499-154189691285579/AnsiballZ_network_connections.py 7491 1727203974.80228: Sending initial data 7491 1727203974.80232: Sent initial data (166 bytes) 7491 1727203974.81338: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203974.81348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203974.81360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203974.81379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203974.81425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203974.81432: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203974.81442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203974.81457: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203974.81467: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203974.81476: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203974.81482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203974.81492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203974.81509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203974.81516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 
1727203974.81526: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203974.81535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203974.81612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203974.81636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203974.81650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203974.81729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203974.83526: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203974.83556: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203974.83597: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp_i8x_pz3 /root/.ansible/tmp/ansible-tmp-1727203974.3911607-8499-154189691285579/AnsiballZ_network_connections.py <<< 7491 1727203974.83630: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203974.85691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203974.85876: stderr chunk (state=3): >>><<< 7491 1727203974.85880: stdout chunk (state=3): >>><<< 7491 1727203974.85882: done transferring module 
to remote 7491 1727203974.85885: _low_level_execute_command(): starting 7491 1727203974.85887: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203974.3911607-8499-154189691285579/ /root/.ansible/tmp/ansible-tmp-1727203974.3911607-8499-154189691285579/AnsiballZ_network_connections.py && sleep 0' 7491 1727203974.87033: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203974.87037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203974.87077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203974.87081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203974.87084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203974.87140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203974.87787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203974.87791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203974.87849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203974.89539: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 7491 1727203974.89622: stderr chunk (state=3): >>><<< 7491 1727203974.89626: stdout chunk (state=3): >>><<< 7491 1727203974.89671: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203974.89677: _low_level_execute_command(): starting 7491 1727203974.89680: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203974.3911607-8499-154189691285579/AnsiballZ_network_connections.py && sleep 0' 7491 1727203974.90358: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203974.90377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203974.90395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203974.90425: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203974.90474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203974.90487: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203974.90502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203974.90525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203974.90536: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203974.90545: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203974.90560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203974.90575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203974.90589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203974.90600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203974.90609: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203974.90627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203974.90703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203974.90720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203974.90740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203974.90823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203975.20488: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 
e01b0787-1873-4334-a8a8-27f8e63061d2\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e01b0787-1873-4334-a8a8-27f8e63061d2 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7491 1727203975.22443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203975.22501: stderr chunk (state=3): >>><<< 7491 1727203975.22504: stdout chunk (state=3): >>><<< 7491 1727203975.22525: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, e01b0787-1873-4334-a8a8-27f8e63061d2\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e01b0787-1873-4334-a8a8-27f8e63061d2 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203975.22566: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'auto_gateway': True, 'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/64', '203.0.113.2/24'], 'gateway6': '2001:db8::1', 'gateway4': '203.0.113.1', 'route_metric4': 65535}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203974.3911607-8499-154189691285579/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203975.22574: _low_level_execute_command(): starting 7491 1727203975.22579: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203974.3911607-8499-154189691285579/ > /dev/null 2>&1 && sleep 0' 7491 1727203975.23053: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203975.23057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203975.23097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.23110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.23162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203975.23182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203975.23229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203975.25022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203975.25083: stderr chunk (state=3): >>><<< 7491 1727203975.25088: stdout chunk (state=3): >>><<< 7491 1727203975.25105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203975.25111: handler run complete 7491 1727203975.25144: attempt loop complete, returning result 7491 1727203975.25147: _execute() done 7491 1727203975.25149: dumping result to json 7491 1727203975.25153: done dumping result, returning 7491 1727203975.25162: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-0a4a-ad01-000000000026] 7491 1727203975.25167: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000026 7491 1727203975.25279: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000026 7491 1727203975.25282: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": true, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1", "route_metric4": 65535 }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, 
"provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, e01b0787-1873-4334-a8a8-27f8e63061d2 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e01b0787-1873-4334-a8a8-27f8e63061d2 (not-active) 7491 1727203975.25398: no more pending results, returning what we have 7491 1727203975.25402: results queue empty 7491 1727203975.25403: checking for any_errors_fatal 7491 1727203975.25411: done checking for any_errors_fatal 7491 1727203975.25412: checking for max_fail_percentage 7491 1727203975.25414: done checking for max_fail_percentage 7491 1727203975.25415: checking to see if all hosts have failed and the running result is not ok 7491 1727203975.25415: done checking to see if all hosts have failed 7491 1727203975.25419: getting the remaining hosts for this loop 7491 1727203975.25420: done getting the remaining hosts for this loop 7491 1727203975.25424: getting the next task for host managed-node3 7491 1727203975.25430: done getting next task for host managed-node3 7491 1727203975.25433: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7491 1727203975.25436: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203975.25446: getting variables 7491 1727203975.25448: in VariableManager get_vars() 7491 1727203975.25502: Calling all_inventory to load vars for managed-node3 7491 1727203975.25506: Calling groups_inventory to load vars for managed-node3 7491 1727203975.25508: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203975.25520: Calling all_plugins_play to load vars for managed-node3 7491 1727203975.25523: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203975.25526: Calling groups_plugins_play to load vars for managed-node3 7491 1727203975.26355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203975.27287: done with get_vars() 7491 1727203975.27309: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:52:55 -0400 (0:00:01.037) 0:00:17.197 ***** 7491 1727203975.27380: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 7491 1727203975.27382: Creating lock for fedora.linux_system_roles.network_state 7491 1727203975.27619: worker is 1 (out of 1 available) 7491 1727203975.27633: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 7491 1727203975.27646: done queuing things up, now waiting for results queue to drain 7491 1727203975.27647: waiting for pending results... 
7491 1727203975.27830: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 7491 1727203975.27926: in run() - task 0affcd87-79f5-0a4a-ad01-000000000027 7491 1727203975.27937: variable 'ansible_search_path' from source: unknown 7491 1727203975.27941: variable 'ansible_search_path' from source: unknown 7491 1727203975.27973: calling self._execute() 7491 1727203975.28045: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.28049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203975.28057: variable 'omit' from source: magic vars 7491 1727203975.28334: variable 'ansible_distribution_major_version' from source: facts 7491 1727203975.28344: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203975.28434: variable 'network_state' from source: role '' defaults 7491 1727203975.28441: Evaluated conditional (network_state != {}): False 7491 1727203975.28446: when evaluation is False, skipping this task 7491 1727203975.28449: _execute() done 7491 1727203975.28452: dumping result to json 7491 1727203975.28454: done dumping result, returning 7491 1727203975.28462: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-0a4a-ad01-000000000027] 7491 1727203975.28471: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000027 7491 1727203975.28557: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000027 7491 1727203975.28560: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203975.28610: no more pending results, returning what we have 7491 1727203975.28613: results queue empty 7491 1727203975.28614: checking for any_errors_fatal 7491 1727203975.28625: done checking for any_errors_fatal 7491 1727203975.28626: 
checking for max_fail_percentage 7491 1727203975.28627: done checking for max_fail_percentage 7491 1727203975.28628: checking to see if all hosts have failed and the running result is not ok 7491 1727203975.28630: done checking to see if all hosts have failed 7491 1727203975.28630: getting the remaining hosts for this loop 7491 1727203975.28632: done getting the remaining hosts for this loop 7491 1727203975.28636: getting the next task for host managed-node3 7491 1727203975.28643: done getting next task for host managed-node3 7491 1727203975.28646: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7491 1727203975.28650: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203975.28668: getting variables 7491 1727203975.28670: in VariableManager get_vars() 7491 1727203975.28715: Calling all_inventory to load vars for managed-node3 7491 1727203975.28717: Calling groups_inventory to load vars for managed-node3 7491 1727203975.28719: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203975.28728: Calling all_plugins_play to load vars for managed-node3 7491 1727203975.28730: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203975.28733: Calling groups_plugins_play to load vars for managed-node3 7491 1727203975.29646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203975.30559: done with get_vars() 7491 1727203975.30581: done getting variables 7491 1727203975.30630: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:52:55 -0400 (0:00:00.032) 0:00:17.230 ***** 7491 1727203975.30656: entering _queue_task() for managed-node3/debug 7491 1727203975.30890: worker is 1 (out of 1 available) 7491 1727203975.30904: exiting _queue_task() for managed-node3/debug 7491 1727203975.30916: done queuing things up, now waiting for results queue to drain 7491 1727203975.30918: waiting for pending results... 
7491 1727203975.31110: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7491 1727203975.31204: in run() - task 0affcd87-79f5-0a4a-ad01-000000000028 7491 1727203975.31216: variable 'ansible_search_path' from source: unknown 7491 1727203975.31220: variable 'ansible_search_path' from source: unknown 7491 1727203975.31251: calling self._execute() 7491 1727203975.31334: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.31339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203975.31347: variable 'omit' from source: magic vars 7491 1727203975.31630: variable 'ansible_distribution_major_version' from source: facts 7491 1727203975.31640: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203975.31646: variable 'omit' from source: magic vars 7491 1727203975.31691: variable 'omit' from source: magic vars 7491 1727203975.31723: variable 'omit' from source: magic vars 7491 1727203975.31758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203975.31788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203975.31809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203975.31825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203975.31836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203975.31859: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203975.31862: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.31868: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed-node3' 7491 1727203975.31943: Set connection var ansible_timeout to 10 7491 1727203975.31948: Set connection var ansible_pipelining to False 7491 1727203975.31954: Set connection var ansible_shell_type to sh 7491 1727203975.31959: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203975.31967: Set connection var ansible_shell_executable to /bin/sh 7491 1727203975.31972: Set connection var ansible_connection to ssh 7491 1727203975.31989: variable 'ansible_shell_executable' from source: unknown 7491 1727203975.31992: variable 'ansible_connection' from source: unknown 7491 1727203975.31995: variable 'ansible_module_compression' from source: unknown 7491 1727203975.31998: variable 'ansible_shell_type' from source: unknown 7491 1727203975.32000: variable 'ansible_shell_executable' from source: unknown 7491 1727203975.32003: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.32005: variable 'ansible_pipelining' from source: unknown 7491 1727203975.32008: variable 'ansible_timeout' from source: unknown 7491 1727203975.32012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203975.32113: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203975.32126: variable 'omit' from source: magic vars 7491 1727203975.32133: starting attempt loop 7491 1727203975.32136: running the handler 7491 1727203975.32235: variable '__network_connections_result' from source: set_fact 7491 1727203975.32279: handler run complete 7491 1727203975.32292: attempt loop complete, returning result 7491 1727203975.32295: _execute() done 7491 1727203975.32297: dumping result to json 7491 1727203975.32299: done dumping result, returning 7491 
1727203975.32307: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-0a4a-ad01-000000000028] 7491 1727203975.32312: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000028 7491 1727203975.32402: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000028 7491 1727203975.32406: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, e01b0787-1873-4334-a8a8-27f8e63061d2", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e01b0787-1873-4334-a8a8-27f8e63061d2 (not-active)" ] } 7491 1727203975.32471: no more pending results, returning what we have 7491 1727203975.32476: results queue empty 7491 1727203975.32477: checking for any_errors_fatal 7491 1727203975.32485: done checking for any_errors_fatal 7491 1727203975.32486: checking for max_fail_percentage 7491 1727203975.32488: done checking for max_fail_percentage 7491 1727203975.32489: checking to see if all hosts have failed and the running result is not ok 7491 1727203975.32490: done checking to see if all hosts have failed 7491 1727203975.32490: getting the remaining hosts for this loop 7491 1727203975.32492: done getting the remaining hosts for this loop 7491 1727203975.32496: getting the next task for host managed-node3 7491 1727203975.32502: done getting next task for host managed-node3 7491 1727203975.32506: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7491 1727203975.32509: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203975.32520: getting variables 7491 1727203975.32522: in VariableManager get_vars() 7491 1727203975.32568: Calling all_inventory to load vars for managed-node3 7491 1727203975.32571: Calling groups_inventory to load vars for managed-node3 7491 1727203975.32573: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203975.32588: Calling all_plugins_play to load vars for managed-node3 7491 1727203975.32590: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203975.32593: Calling groups_plugins_play to load vars for managed-node3 7491 1727203975.33392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203975.34311: done with get_vars() 7491 1727203975.34334: done getting variables 7491 1727203975.34382: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:52:55 -0400 (0:00:00.037) 0:00:17.267 ***** 7491 1727203975.34409: entering _queue_task() for managed-node3/debug 7491 1727203975.34642: worker is 1 (out of 1 available) 7491 1727203975.34656: exiting _queue_task() for managed-node3/debug 7491 
1727203975.34671: done queuing things up, now waiting for results queue to drain 7491 1727203975.34672: waiting for pending results... 7491 1727203975.34866: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7491 1727203975.34959: in run() - task 0affcd87-79f5-0a4a-ad01-000000000029 7491 1727203975.34976: variable 'ansible_search_path' from source: unknown 7491 1727203975.34980: variable 'ansible_search_path' from source: unknown 7491 1727203975.35015: calling self._execute() 7491 1727203975.35099: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.35103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203975.35113: variable 'omit' from source: magic vars 7491 1727203975.35394: variable 'ansible_distribution_major_version' from source: facts 7491 1727203975.35405: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203975.35412: variable 'omit' from source: magic vars 7491 1727203975.35458: variable 'omit' from source: magic vars 7491 1727203975.35487: variable 'omit' from source: magic vars 7491 1727203975.35523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203975.35552: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203975.35571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203975.35585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203975.35593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203975.35618: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203975.35626: variable 
'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.35628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203975.35703: Set connection var ansible_timeout to 10 7491 1727203975.35708: Set connection var ansible_pipelining to False 7491 1727203975.35713: Set connection var ansible_shell_type to sh 7491 1727203975.35721: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203975.35732: Set connection var ansible_shell_executable to /bin/sh 7491 1727203975.35736: Set connection var ansible_connection to ssh 7491 1727203975.35756: variable 'ansible_shell_executable' from source: unknown 7491 1727203975.35758: variable 'ansible_connection' from source: unknown 7491 1727203975.35761: variable 'ansible_module_compression' from source: unknown 7491 1727203975.35765: variable 'ansible_shell_type' from source: unknown 7491 1727203975.35767: variable 'ansible_shell_executable' from source: unknown 7491 1727203975.35770: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.35776: variable 'ansible_pipelining' from source: unknown 7491 1727203975.35778: variable 'ansible_timeout' from source: unknown 7491 1727203975.35783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203975.35884: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203975.35893: variable 'omit' from source: magic vars 7491 1727203975.35899: starting attempt loop 7491 1727203975.35902: running the handler 7491 1727203975.35941: variable '__network_connections_result' from source: set_fact 7491 1727203975.36000: variable '__network_connections_result' from source: set_fact 7491 1727203975.36103: handler run 
complete 7491 1727203975.36128: attempt loop complete, returning result 7491 1727203975.36131: _execute() done 7491 1727203975.36133: dumping result to json 7491 1727203975.36138: done dumping result, returning 7491 1727203975.36145: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-0a4a-ad01-000000000029] 7491 1727203975.36151: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000029 7491 1727203975.36243: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000029 7491 1727203975.36245: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": true, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1", "route_metric4": 65535 }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, e01b0787-1873-4334-a8a8-27f8e63061d2\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e01b0787-1873-4334-a8a8-27f8e63061d2 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, e01b0787-1873-4334-a8a8-27f8e63061d2", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e01b0787-1873-4334-a8a8-27f8e63061d2 (not-active)" ] } } 7491 1727203975.36339: no more pending results, returning what we have 7491 1727203975.36342: results queue empty 7491 1727203975.36344: checking for any_errors_fatal 7491 1727203975.36349: done checking for any_errors_fatal 7491 1727203975.36350: checking for max_fail_percentage 7491 
1727203975.36351: done checking for max_fail_percentage 7491 1727203975.36352: checking to see if all hosts have failed and the running result is not ok 7491 1727203975.36353: done checking to see if all hosts have failed 7491 1727203975.36354: getting the remaining hosts for this loop 7491 1727203975.36356: done getting the remaining hosts for this loop 7491 1727203975.36359: getting the next task for host managed-node3 7491 1727203975.36366: done getting next task for host managed-node3 7491 1727203975.36370: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7491 1727203975.36373: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203975.36390: getting variables 7491 1727203975.36399: in VariableManager get_vars() 7491 1727203975.36441: Calling all_inventory to load vars for managed-node3 7491 1727203975.36444: Calling groups_inventory to load vars for managed-node3 7491 1727203975.36446: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203975.36454: Calling all_plugins_play to load vars for managed-node3 7491 1727203975.36456: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203975.36458: Calling groups_plugins_play to load vars for managed-node3 7491 1727203975.37344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203975.38257: done with get_vars() 7491 1727203975.38278: done getting variables 7491 1727203975.38322: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:52:55 -0400 (0:00:00.039) 0:00:17.307 ***** 7491 1727203975.38349: entering _queue_task() for managed-node3/debug 7491 1727203975.38573: worker is 1 (out of 1 available) 7491 1727203975.38585: exiting _queue_task() for managed-node3/debug 7491 1727203975.38598: done queuing things up, now waiting for results queue to drain 7491 1727203975.38600: waiting for pending results... 
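The `module_args` dumped in the `network_connections` result above imply a role invocation along the following lines. This is a hedged reconstruction from the logged values only, not the actual playbook that produced this run:

```yaml
# Hypothetical sketch inferred from the logged module_args; the real
# playbook vars may differ in layout but must carry these values.
network_connections:
  - name: veth0
    type: ethernet
    state: up
    ip:
      dhcp4: false
      auto6: false
      auto_gateway: true
      address:
        - "2001:db8::2/64"
        - "203.0.113.2/24"
      gateway4: "203.0.113.1"
      gateway6: "2001:db8::1"
      route_metric4: 65535
```

The `stderr_lines` in the same result show the `nm` provider first adding the `veth0` profile, then activating it (`(not-active)` indicates it was newly brought up rather than already active).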
7491 1727203975.38791: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7491 1727203975.38889: in run() - task 0affcd87-79f5-0a4a-ad01-00000000002a 7491 1727203975.38900: variable 'ansible_search_path' from source: unknown 7491 1727203975.38904: variable 'ansible_search_path' from source: unknown 7491 1727203975.38934: calling self._execute() 7491 1727203975.39013: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.39018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203975.39025: variable 'omit' from source: magic vars 7491 1727203975.39304: variable 'ansible_distribution_major_version' from source: facts 7491 1727203975.39315: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203975.39402: variable 'network_state' from source: role '' defaults 7491 1727203975.39411: Evaluated conditional (network_state != {}): False 7491 1727203975.39418: when evaluation is False, skipping this task 7491 1727203975.39421: _execute() done 7491 1727203975.39423: dumping result to json 7491 1727203975.39426: done dumping result, returning 7491 1727203975.39430: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-0a4a-ad01-00000000002a] 7491 1727203975.39436: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000002a 7491 1727203975.39525: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000002a 7491 1727203975.39527: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 7491 1727203975.39574: no more pending results, returning what we have 7491 1727203975.39577: results queue empty 7491 1727203975.39578: checking for any_errors_fatal 7491 1727203975.39588: done checking for any_errors_fatal 7491 1727203975.39589: checking for max_fail_percentage 7491 
1727203975.39591: done checking for max_fail_percentage 7491 1727203975.39592: checking to see if all hosts have failed and the running result is not ok 7491 1727203975.39593: done checking to see if all hosts have failed 7491 1727203975.39594: getting the remaining hosts for this loop 7491 1727203975.39597: done getting the remaining hosts for this loop 7491 1727203975.39600: getting the next task for host managed-node3 7491 1727203975.39606: done getting next task for host managed-node3 7491 1727203975.39610: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7491 1727203975.39613: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203975.39631: getting variables 7491 1727203975.39633: in VariableManager get_vars() 7491 1727203975.39678: Calling all_inventory to load vars for managed-node3 7491 1727203975.39681: Calling groups_inventory to load vars for managed-node3 7491 1727203975.39683: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203975.39692: Calling all_plugins_play to load vars for managed-node3 7491 1727203975.39694: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203975.39697: Calling groups_plugins_play to load vars for managed-node3 7491 1727203975.40493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203975.41501: done with get_vars() 7491 1727203975.41519: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:52:55 -0400 (0:00:00.032) 0:00:17.339 ***** 7491 1727203975.41590: entering _queue_task() for managed-node3/ping 7491 1727203975.41591: Creating lock for ping 7491 1727203975.41820: worker is 1 (out of 1 available) 7491 1727203975.41832: exiting _queue_task() for managed-node3/ping 7491 1727203975.41844: done queuing things up, now waiting for results queue to drain 7491 1727203975.41846: waiting for pending results... 
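The skip above reports `false_condition: "network_state != {}"` for the task at `roles/network/tasks/main.yml:186`. From the logged task name, action plugin (`debug`), and condition, the task presumably looks roughly like this; the actual role source may differ:

```yaml
# Hedged sketch of the skipped task, inferred from the log only.
- name: Show debug messages for the network_state
  debug:
    var: network_state
  when: network_state != {}
```

Since this run configured only `network_connections` and left `network_state` at its role default of `{}`, the conditional evaluates to `False` and the task is skipped without contacting the host.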
7491 1727203975.42039: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7491 1727203975.42129: in run() - task 0affcd87-79f5-0a4a-ad01-00000000002b 7491 1727203975.42140: variable 'ansible_search_path' from source: unknown 7491 1727203975.42143: variable 'ansible_search_path' from source: unknown 7491 1727203975.42178: calling self._execute() 7491 1727203975.42249: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.42253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203975.42264: variable 'omit' from source: magic vars 7491 1727203975.42544: variable 'ansible_distribution_major_version' from source: facts 7491 1727203975.42553: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203975.42560: variable 'omit' from source: magic vars 7491 1727203975.42608: variable 'omit' from source: magic vars 7491 1727203975.42636: variable 'omit' from source: magic vars 7491 1727203975.42670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203975.42696: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203975.42721: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203975.42734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203975.42744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203975.42768: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203975.42772: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.42776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 
1727203975.42852: Set connection var ansible_timeout to 10 7491 1727203975.42857: Set connection var ansible_pipelining to False 7491 1727203975.42862: Set connection var ansible_shell_type to sh 7491 1727203975.42869: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203975.42876: Set connection var ansible_shell_executable to /bin/sh 7491 1727203975.42881: Set connection var ansible_connection to ssh 7491 1727203975.42897: variable 'ansible_shell_executable' from source: unknown 7491 1727203975.42900: variable 'ansible_connection' from source: unknown 7491 1727203975.42903: variable 'ansible_module_compression' from source: unknown 7491 1727203975.42905: variable 'ansible_shell_type' from source: unknown 7491 1727203975.42907: variable 'ansible_shell_executable' from source: unknown 7491 1727203975.42911: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.42913: variable 'ansible_pipelining' from source: unknown 7491 1727203975.42915: variable 'ansible_timeout' from source: unknown 7491 1727203975.42921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203975.43071: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203975.43080: variable 'omit' from source: magic vars 7491 1727203975.43084: starting attempt loop 7491 1727203975.43087: running the handler 7491 1727203975.43098: _low_level_execute_command(): starting 7491 1727203975.43105: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203975.43651: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203975.43671: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203975.43687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.43708: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.43749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203975.43762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203975.43824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203975.45389: stdout chunk (state=3): >>>/root <<< 7491 1727203975.45484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203975.45547: stderr chunk (state=3): >>><<< 7491 1727203975.45551: stdout chunk (state=3): >>><<< 7491 1727203975.45574: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203975.45585: _low_level_execute_command(): starting 7491 1727203975.45591: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203975.4557457-8534-26641590372680 `" && echo ansible-tmp-1727203975.4557457-8534-26641590372680="` echo /root/.ansible/tmp/ansible-tmp-1727203975.4557457-8534-26641590372680 `" ) && sleep 0' 7491 1727203975.46076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203975.46089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203975.46100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.46112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203975.46133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.46176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203975.46186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203975.46193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203975.46245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203975.48047: stdout chunk (state=3): >>>ansible-tmp-1727203975.4557457-8534-26641590372680=/root/.ansible/tmp/ansible-tmp-1727203975.4557457-8534-26641590372680 <<< 7491 1727203975.48159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203975.48217: stderr chunk (state=3): >>><<< 7491 1727203975.48227: stdout chunk (state=3): >>><<< 7491 1727203975.48246: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203975.4557457-8534-26641590372680=/root/.ansible/tmp/ansible-tmp-1727203975.4557457-8534-26641590372680 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203975.48288: variable 'ansible_module_compression' from source: unknown 7491 1727203975.48329: ANSIBALLZ: Using lock for ping 7491 1727203975.48332: ANSIBALLZ: Acquiring lock 7491 1727203975.48335: ANSIBALLZ: Lock acquired: 139674604380784 7491 1727203975.48337: ANSIBALLZ: Creating module 7491 1727203975.58834: ANSIBALLZ: Writing module into payload 7491 1727203975.58909: ANSIBALLZ: Writing module 7491 1727203975.58940: ANSIBALLZ: Renaming module 7491 1727203975.58952: ANSIBALLZ: Done creating module 7491 1727203975.58977: variable 'ansible_facts' from source: unknown 7491 1727203975.59036: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203975.4557457-8534-26641590372680/AnsiballZ_ping.py 7491 1727203975.59169: Sending initial data 7491 1727203975.59173: Sent initial data (150 bytes) 7491 1727203975.59857: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203975.59865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203975.59920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203975.59924: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.59926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727203975.59928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203975.59930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.59972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203975.59985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203975.60036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203975.61714: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7491 1727203975.61729: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 7491 1727203975.61739: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 7491 1727203975.61749: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 7491 1727203975.61759: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 7491 1727203975.61780: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 7491 1727203975.61797: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203975.61841: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server 
handle limit 1019; using 64 <<< 7491 1727203975.61892: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmphg_qf6bd /root/.ansible/tmp/ansible-tmp-1727203975.4557457-8534-26641590372680/AnsiballZ_ping.py <<< 7491 1727203975.61923: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203975.63007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203975.63268: stderr chunk (state=3): >>><<< 7491 1727203975.63273: stdout chunk (state=3): >>><<< 7491 1727203975.63275: done transferring module to remote 7491 1727203975.63277: _low_level_execute_command(): starting 7491 1727203975.63279: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203975.4557457-8534-26641590372680/ /root/.ansible/tmp/ansible-tmp-1727203975.4557457-8534-26641590372680/AnsiballZ_ping.py && sleep 0' 7491 1727203975.63930: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203975.63952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203975.63968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203975.63986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203975.64031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203975.64053: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203975.64069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.64088: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203975.64099: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203975.64108: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 7491 1727203975.64119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203975.64136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203975.64150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203975.64173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203975.64186: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203975.64198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.64285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203975.64302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203975.64316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203975.64417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203975.66116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203975.66201: stderr chunk (state=3): >>><<< 7491 1727203975.66204: stdout chunk (state=3): >>><<< 7491 1727203975.66307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203975.66311: _low_level_execute_command(): starting 7491 1727203975.66315: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203975.4557457-8534-26641590372680/AnsiballZ_ping.py && sleep 0' 7491 1727203975.66945: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203975.66949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203975.66991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203975.66994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203975.66997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.67072: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203975.67075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203975.67080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203975.67133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203975.79988: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7491 1727203975.80946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203975.81025: stderr chunk (state=3): >>><<< 7491 1727203975.81030: stdout chunk (state=3): >>><<< 7491 1727203975.81148: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203975.81153: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203975.4557457-8534-26641590372680/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203975.81156: _low_level_execute_command(): starting 7491 1727203975.81158: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203975.4557457-8534-26641590372680/ > /dev/null 2>&1 && sleep 0' 7491 1727203975.81753: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203975.81773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203975.81789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203975.81808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203975.81857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203975.81876: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203975.81892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.81910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203975.81926: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.15.87 is address <<< 7491 1727203975.81937: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203975.81950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203975.81971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203975.81988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203975.82002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203975.82014: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203975.82032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203975.82113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203975.82134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203975.82150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203975.82234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203975.84088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203975.84141: stderr chunk (state=3): >>><<< 7491 1727203975.84145: stdout chunk (state=3): >>><<< 7491 1727203975.84371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203975.84375: handler run complete 7491 1727203975.84377: attempt loop complete, returning result 7491 1727203975.84380: _execute() done 7491 1727203975.84382: dumping result to json 7491 1727203975.84384: done dumping result, returning 7491 1727203975.84386: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-0a4a-ad01-00000000002b] 7491 1727203975.84388: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000002b 7491 1727203975.84461: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000002b 7491 1727203975.84466: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 7491 1727203975.84538: no more pending results, returning what we have 7491 1727203975.84542: results queue empty 7491 1727203975.84544: checking for any_errors_fatal 7491 1727203975.84551: done checking for any_errors_fatal 7491 1727203975.84552: checking for max_fail_percentage 7491 1727203975.84554: done checking for max_fail_percentage 7491 1727203975.84555: checking to see if all hosts have failed and the running result is not ok 7491 1727203975.84557: done checking to see if all hosts have failed 7491 1727203975.84557: getting the remaining hosts for this loop 7491 1727203975.84559: done getting the remaining hosts for this loop 7491 1727203975.84569: getting 
the next task for host managed-node3 7491 1727203975.84581: done getting next task for host managed-node3 7491 1727203975.84584: ^ task is: TASK: meta (role_complete) 7491 1727203975.84587: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203975.84599: getting variables 7491 1727203975.84602: in VariableManager get_vars() 7491 1727203975.84660: Calling all_inventory to load vars for managed-node3 7491 1727203975.84665: Calling groups_inventory to load vars for managed-node3 7491 1727203975.84668: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203975.84679: Calling all_plugins_play to load vars for managed-node3 7491 1727203975.84682: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203975.84686: Calling groups_plugins_play to load vars for managed-node3 7491 1727203975.86778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203975.88462: done with get_vars() 7491 1727203975.88496: done getting variables 7491 1727203975.88742: done queuing things up, now waiting for results queue to drain 7491 1727203975.88745: results queue empty 7491 1727203975.88746: checking for any_errors_fatal 7491 1727203975.88749: done checking for any_errors_fatal 7491 1727203975.88750: checking for max_fail_percentage 7491 1727203975.88751: done checking for max_fail_percentage 7491 1727203975.88752: checking to see if all hosts have failed and the 
running result is not ok 7491 1727203975.88753: done checking to see if all hosts have failed 7491 1727203975.88753: getting the remaining hosts for this loop 7491 1727203975.88754: done getting the remaining hosts for this loop 7491 1727203975.88757: getting the next task for host managed-node3 7491 1727203975.88762: done getting next task for host managed-node3 7491 1727203975.88766: ^ task is: TASK: Include the task 'assert_device_present.yml' 7491 1727203975.88768: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203975.88771: getting variables 7491 1727203975.88772: in VariableManager get_vars() 7491 1727203975.88796: Calling all_inventory to load vars for managed-node3 7491 1727203975.88799: Calling groups_inventory to load vars for managed-node3 7491 1727203975.88801: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203975.88806: Calling all_plugins_play to load vars for managed-node3 7491 1727203975.88808: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203975.88811: Calling groups_plugins_play to load vars for managed-node3 7491 1727203975.90141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203975.92351: done with get_vars() 7491 1727203975.92388: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:42 Tuesday 24 September 2024 14:52:55 -0400 (0:00:00.508) 0:00:17.848 ***** 7491 1727203975.92477: entering _queue_task() for managed-node3/include_tasks 7491 1727203975.93038: worker is 1 (out of 1 available) 7491 
1727203975.93061: exiting _queue_task() for managed-node3/include_tasks 7491 1727203975.93076: done queuing things up, now waiting for results queue to drain 7491 1727203975.93077: waiting for pending results... 7491 1727203975.93401: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' 7491 1727203975.93542: in run() - task 0affcd87-79f5-0a4a-ad01-00000000005b 7491 1727203975.93566: variable 'ansible_search_path' from source: unknown 7491 1727203975.93619: calling self._execute() 7491 1727203975.93741: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203975.93753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203975.93769: variable 'omit' from source: magic vars 7491 1727203975.94220: variable 'ansible_distribution_major_version' from source: facts 7491 1727203975.94240: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203975.94258: _execute() done 7491 1727203975.94269: dumping result to json 7491 1727203975.94284: done dumping result, returning 7491 1727203975.94295: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' [0affcd87-79f5-0a4a-ad01-00000000005b] 7491 1727203975.94308: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000005b 7491 1727203975.94463: no more pending results, returning what we have 7491 1727203975.94471: in VariableManager get_vars() 7491 1727203975.94543: Calling all_inventory to load vars for managed-node3 7491 1727203975.94546: Calling groups_inventory to load vars for managed-node3 7491 1727203975.94548: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203975.94563: Calling all_plugins_play to load vars for managed-node3 7491 1727203975.94567: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203975.94570: Calling groups_plugins_play to load vars for managed-node3 7491 1727203975.95623: done sending task 
result for task 0affcd87-79f5-0a4a-ad01-00000000005b 7491 1727203975.95628: WORKER PROCESS EXITING 7491 1727203975.96437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203975.98225: done with get_vars() 7491 1727203975.98258: variable 'ansible_search_path' from source: unknown 7491 1727203975.98278: we have included files to process 7491 1727203975.98279: generating all_blocks data 7491 1727203975.98282: done generating all_blocks data 7491 1727203975.98289: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7491 1727203975.98290: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7491 1727203975.98292: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7491 1727203975.98401: in VariableManager get_vars() 7491 1727203975.98431: done with get_vars() 7491 1727203975.98549: done processing included file 7491 1727203975.98551: iterating over new_blocks loaded from include file 7491 1727203975.98553: in VariableManager get_vars() 7491 1727203975.98583: done with get_vars() 7491 1727203975.98585: filtering new block on tags 7491 1727203975.98604: done filtering new block on tags 7491 1727203975.98606: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 7491 1727203975.98612: extending task lists for all hosts with included blocks 7491 1727203976.03356: done extending task lists 7491 1727203976.03358: done processing included files 7491 1727203976.03359: results queue empty 7491 1727203976.03359: checking for any_errors_fatal 7491 1727203976.03361: done checking for 
any_errors_fatal 7491 1727203976.03367: checking for max_fail_percentage 7491 1727203976.03368: done checking for max_fail_percentage 7491 1727203976.03369: checking to see if all hosts have failed and the running result is not ok 7491 1727203976.03370: done checking to see if all hosts have failed 7491 1727203976.03371: getting the remaining hosts for this loop 7491 1727203976.03372: done getting the remaining hosts for this loop 7491 1727203976.03377: getting the next task for host managed-node3 7491 1727203976.03381: done getting next task for host managed-node3 7491 1727203976.03383: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7491 1727203976.03386: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203976.03388: getting variables 7491 1727203976.03390: in VariableManager get_vars() 7491 1727203976.03415: Calling all_inventory to load vars for managed-node3 7491 1727203976.03420: Calling groups_inventory to load vars for managed-node3 7491 1727203976.03423: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203976.03429: Calling all_plugins_play to load vars for managed-node3 7491 1727203976.03432: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203976.03435: Calling groups_plugins_play to load vars for managed-node3 7491 1727203976.04871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203976.06585: done with get_vars() 7491 1727203976.06623: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:52:56 -0400 (0:00:00.142) 0:00:17.991 ***** 7491 1727203976.06715: entering _queue_task() for managed-node3/include_tasks 7491 1727203976.07069: worker is 1 (out of 1 available) 7491 1727203976.07082: exiting _queue_task() for managed-node3/include_tasks 7491 1727203976.07095: done queuing things up, now waiting for results queue to drain 7491 1727203976.07097: waiting for pending results... 
7491 1727203976.07411: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 7491 1727203976.07551: in run() - task 0affcd87-79f5-0a4a-ad01-0000000008c2 7491 1727203976.07574: variable 'ansible_search_path' from source: unknown 7491 1727203976.07582: variable 'ansible_search_path' from source: unknown 7491 1727203976.07631: calling self._execute() 7491 1727203976.07743: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.07759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203976.07777: variable 'omit' from source: magic vars 7491 1727203976.08185: variable 'ansible_distribution_major_version' from source: facts 7491 1727203976.08206: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203976.08221: _execute() done 7491 1727203976.08231: dumping result to json 7491 1727203976.08240: done dumping result, returning 7491 1727203976.08250: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-0a4a-ad01-0000000008c2] 7491 1727203976.08269: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000008c2 7491 1727203976.08392: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000008c2 7491 1727203976.08401: WORKER PROCESS EXITING 7491 1727203976.08437: no more pending results, returning what we have 7491 1727203976.08442: in VariableManager get_vars() 7491 1727203976.08510: Calling all_inventory to load vars for managed-node3 7491 1727203976.08514: Calling groups_inventory to load vars for managed-node3 7491 1727203976.08520: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203976.08536: Calling all_plugins_play to load vars for managed-node3 7491 1727203976.08539: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203976.08543: Calling groups_plugins_play to load vars for managed-node3 7491 1727203976.10439: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203976.19334: done with get_vars() 7491 1727203976.19526: variable 'ansible_search_path' from source: unknown 7491 1727203976.19528: variable 'ansible_search_path' from source: unknown 7491 1727203976.19561: we have included files to process 7491 1727203976.19562: generating all_blocks data 7491 1727203976.19563: done generating all_blocks data 7491 1727203976.19648: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7491 1727203976.19650: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7491 1727203976.19653: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7491 1727203976.19941: done processing included file 7491 1727203976.19943: iterating over new_blocks loaded from include file 7491 1727203976.19945: in VariableManager get_vars() 7491 1727203976.19974: done with get_vars() 7491 1727203976.19976: filtering new block on tags 7491 1727203976.19991: done filtering new block on tags 7491 1727203976.19993: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 7491 1727203976.19998: extending task lists for all hosts with included blocks 7491 1727203976.20092: done extending task lists 7491 1727203976.20094: done processing included files 7491 1727203976.20095: results queue empty 7491 1727203976.20095: checking for any_errors_fatal 7491 1727203976.20098: done checking for any_errors_fatal 7491 1727203976.20099: checking for max_fail_percentage 7491 1727203976.20100: done checking for max_fail_percentage 7491 1727203976.20101: 
checking to see if all hosts have failed and the running result is not ok 7491 1727203976.20102: done checking to see if all hosts have failed 7491 1727203976.20103: getting the remaining hosts for this loop 7491 1727203976.20104: done getting the remaining hosts for this loop 7491 1727203976.20106: getting the next task for host managed-node3 7491 1727203976.20109: done getting next task for host managed-node3 7491 1727203976.20111: ^ task is: TASK: Get stat for interface {{ interface }} 7491 1727203976.20114: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203976.20119: getting variables 7491 1727203976.20120: in VariableManager get_vars() 7491 1727203976.20139: Calling all_inventory to load vars for managed-node3 7491 1727203976.20141: Calling groups_inventory to load vars for managed-node3 7491 1727203976.20143: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203976.20149: Calling all_plugins_play to load vars for managed-node3 7491 1727203976.20151: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203976.20154: Calling groups_plugins_play to load vars for managed-node3 7491 1727203976.21409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203976.25373: done with get_vars() 7491 1727203976.25406: done getting variables 7491 1727203976.25573: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:52:56 -0400 (0:00:00.188) 0:00:18.179 ***** 7491 1727203976.25603: entering _queue_task() for managed-node3/stat 7491 1727203976.25919: worker is 1 (out of 1 available) 7491 1727203976.25930: exiting _queue_task() for managed-node3/stat 7491 1727203976.25942: done queuing things up, now waiting for results queue to drain 7491 1727203976.25944: waiting for pending results... 
7491 1727203976.27466: running TaskExecutor() for managed-node3/TASK: Get stat for interface veth0 7491 1727203976.27597: in run() - task 0affcd87-79f5-0a4a-ad01-000000000ac6 7491 1727203976.27624: variable 'ansible_search_path' from source: unknown 7491 1727203976.27632: variable 'ansible_search_path' from source: unknown 7491 1727203976.27679: calling self._execute() 7491 1727203976.27802: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.27818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203976.27836: variable 'omit' from source: magic vars 7491 1727203976.28241: variable 'ansible_distribution_major_version' from source: facts 7491 1727203976.28263: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203976.28276: variable 'omit' from source: magic vars 7491 1727203976.28327: variable 'omit' from source: magic vars 7491 1727203976.28434: variable 'interface' from source: play vars 7491 1727203976.28458: variable 'omit' from source: magic vars 7491 1727203976.28511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203976.28553: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203976.28588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203976.28610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203976.28630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203976.28668: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203976.28678: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.28690: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed-node3' 7491 1727203976.28806: Set connection var ansible_timeout to 10 7491 1727203976.28822: Set connection var ansible_pipelining to False 7491 1727203976.28834: Set connection var ansible_shell_type to sh 7491 1727203976.28844: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203976.28855: Set connection var ansible_shell_executable to /bin/sh 7491 1727203976.28866: Set connection var ansible_connection to ssh 7491 1727203976.28893: variable 'ansible_shell_executable' from source: unknown 7491 1727203976.28905: variable 'ansible_connection' from source: unknown 7491 1727203976.28913: variable 'ansible_module_compression' from source: unknown 7491 1727203976.28924: variable 'ansible_shell_type' from source: unknown 7491 1727203976.28933: variable 'ansible_shell_executable' from source: unknown 7491 1727203976.28940: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.28948: variable 'ansible_pipelining' from source: unknown 7491 1727203976.28955: variable 'ansible_timeout' from source: unknown 7491 1727203976.28962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203976.29173: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203976.29191: variable 'omit' from source: magic vars 7491 1727203976.29204: starting attempt loop 7491 1727203976.29212: running the handler 7491 1727203976.29238: _low_level_execute_command(): starting 7491 1727203976.29251: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203976.30036: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203976.30052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 
1727203976.30072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203976.30096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203976.30150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203976.30163: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203976.30182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.30201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203976.30219: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203976.30232: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203976.30245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203976.30260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203976.30281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203976.30295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203976.30307: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203976.30330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.30409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203976.30441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203976.30459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203976.30545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203976.32167: stdout chunk 
(state=3): >>>/root <<< 7491 1727203976.32370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203976.32374: stdout chunk (state=3): >>><<< 7491 1727203976.32377: stderr chunk (state=3): >>><<< 7491 1727203976.32476: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203976.32480: _low_level_execute_command(): starting 7491 1727203976.32483: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203976.3240151-8560-179438483623077 `" && echo ansible-tmp-1727203976.3240151-8560-179438483623077="` echo /root/.ansible/tmp/ansible-tmp-1727203976.3240151-8560-179438483623077 `" ) && sleep 0' 7491 1727203976.35226: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203976.35231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203976.35260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.35265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203976.35270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.35442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203976.35505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203976.35710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203976.37451: stdout chunk (state=3): >>>ansible-tmp-1727203976.3240151-8560-179438483623077=/root/.ansible/tmp/ansible-tmp-1727203976.3240151-8560-179438483623077 <<< 7491 1727203976.37568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203976.37650: stderr chunk (state=3): >>><<< 7491 1727203976.37653: stdout chunk (state=3): >>><<< 7491 1727203976.37683: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203976.3240151-8560-179438483623077=/root/.ansible/tmp/ansible-tmp-1727203976.3240151-8560-179438483623077 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203976.37734: variable 'ansible_module_compression' from source: unknown 7491 1727203976.37802: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7491 1727203976.37844: variable 'ansible_facts' from source: unknown 7491 1727203976.37931: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203976.3240151-8560-179438483623077/AnsiballZ_stat.py 7491 1727203976.38817: Sending initial data 7491 1727203976.38821: Sent initial data (151 bytes) 7491 1727203976.39849: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203976.39854: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203976.39900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.39905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203976.39927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.39997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203976.40013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203976.40018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203976.40095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203976.41787: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203976.41834: stderr chunk (state=3): >>>debug1: Using server download size 
261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203976.41883: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp5kurblyr /root/.ansible/tmp/ansible-tmp-1727203976.3240151-8560-179438483623077/AnsiballZ_stat.py <<< 7491 1727203976.41918: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203976.43114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203976.43205: stderr chunk (state=3): >>><<< 7491 1727203976.43208: stdout chunk (state=3): >>><<< 7491 1727203976.43235: done transferring module to remote 7491 1727203976.43245: _low_level_execute_command(): starting 7491 1727203976.43250: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203976.3240151-8560-179438483623077/ /root/.ansible/tmp/ansible-tmp-1727203976.3240151-8560-179438483623077/AnsiballZ_stat.py && sleep 0' 7491 1727203976.43971: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203976.43974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203976.43977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203976.43980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203976.43983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203976.43985: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203976.44010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.44014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203976.44017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address 
<<< 7491 1727203976.44023: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203976.44116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203976.44119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203976.44123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203976.44125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203976.44127: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203976.44129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.44159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203976.44163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203976.44172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203976.44283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203976.46017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203976.46021: stdout chunk (state=3): >>><<< 7491 1727203976.46032: stderr chunk (state=3): >>><<< 7491 1727203976.46050: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203976.46053: _low_level_execute_command(): starting 7491 1727203976.46058: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203976.3240151-8560-179438483623077/AnsiballZ_stat.py && sleep 0' 7491 1727203976.47209: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203976.47798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203976.47900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203976.47913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203976.47959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203976.47968: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203976.47978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.47991: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203976.47999: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203976.48005: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203976.48013: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203976.48025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203976.48036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203976.48046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203976.48049: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203976.48058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.48175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203976.48273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203976.48282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203976.48671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203976.61484: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23327, "dev": 21, "nlink": 1, "atime": 1727203967.8445623, "mtime": 1727203967.8445623, "ctime": 1727203967.8445623, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, 
"checksum_algorithm": "sha1"}}} <<< 7491 1727203976.62537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203976.62541: stdout chunk (state=3): >>><<< 7491 1727203976.62547: stderr chunk (state=3): >>><<< 7491 1727203976.62580: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23327, "dev": 21, "nlink": 1, "atime": 1727203967.8445623, "mtime": 1727203967.8445623, "ctime": 1727203967.8445623, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203976.62639: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203976.3240151-8560-179438483623077/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203976.62648: _low_level_execute_command(): starting 7491 1727203976.62653: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203976.3240151-8560-179438483623077/ > /dev/null 2>&1 && sleep 0' 7491 1727203976.63365: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203976.63375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203976.63395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203976.63409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 
1727203976.63451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203976.63458: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203976.63473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.63489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203976.63504: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203976.63514: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203976.63526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203976.63535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203976.63547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203976.63554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203976.63562: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203976.63574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203976.63695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203976.63724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203976.63737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203976.64371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203976.65697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203976.65701: stdout chunk (state=3): >>><<< 7491 1727203976.65710: stderr chunk (state=3): >>><<< 7491 1727203976.65731: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203976.65737: handler run complete 7491 1727203976.65792: attempt loop complete, returning result 7491 1727203976.65796: _execute() done 7491 1727203976.65798: dumping result to json 7491 1727203976.65804: done dumping result, returning 7491 1727203976.65813: done running TaskExecutor() for managed-node3/TASK: Get stat for interface veth0 [0affcd87-79f5-0a4a-ad01-000000000ac6] 7491 1727203976.65822: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000ac6 7491 1727203976.65940: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000ac6 7491 1727203976.65943: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727203967.8445623, "block_size": 4096, "blocks": 0, "ctime": 1727203967.8445623, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": 
"root", "inode": 23327, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1727203967.8445623, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7491 1727203976.66088: no more pending results, returning what we have 7491 1727203976.66092: results queue empty 7491 1727203976.66093: checking for any_errors_fatal 7491 1727203976.66095: done checking for any_errors_fatal 7491 1727203976.66096: checking for max_fail_percentage 7491 1727203976.66097: done checking for max_fail_percentage 7491 1727203976.66098: checking to see if all hosts have failed and the running result is not ok 7491 1727203976.66100: done checking to see if all hosts have failed 7491 1727203976.66100: getting the remaining hosts for this loop 7491 1727203976.66102: done getting the remaining hosts for this loop 7491 1727203976.66107: getting the next task for host managed-node3 7491 1727203976.66115: done getting next task for host managed-node3 7491 1727203976.66120: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7491 1727203976.66123: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 7491 1727203976.66127: getting variables 7491 1727203976.66129: in VariableManager get_vars() 7491 1727203976.66182: Calling all_inventory to load vars for managed-node3 7491 1727203976.66190: Calling groups_inventory to load vars for managed-node3 7491 1727203976.66192: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203976.66204: Calling all_plugins_play to load vars for managed-node3 7491 1727203976.66207: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203976.66210: Calling groups_plugins_play to load vars for managed-node3 7491 1727203976.68615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203976.72323: done with get_vars() 7491 1727203976.72362: done getting variables 7491 1727203976.72427: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203976.72679: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:52:56 -0400 (0:00:00.471) 0:00:18.651 ***** 7491 1727203976.72719: entering _queue_task() for managed-node3/assert 7491 1727203976.73071: worker is 1 (out of 1 available) 7491 1727203976.73083: exiting _queue_task() for managed-node3/assert 7491 1727203976.73096: done queuing things up, now waiting for results queue to drain 7491 1727203976.73097: waiting for pending results... 
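For readability: the `_execute_module (stat, {...})` dump above records the exact arguments the worker passed to the stat module. A task of roughly the following shape would produce that invocation; this is a reconstruction from the logged `module_args` only, not the actual contents of `tasks/assert_device_present.yml` (which this log never prints), and the log shows the result being exposed as `interface_stat` via `set_fact`:

```yaml
# Hypothetical reconstruction from the logged module_args.
# The real task lives in .../tests/network/playbooks/tasks/assert_device_present.yml.
- name: Get stat for interface veth0
  stat:
    path: /sys/class/net/veth0   # per the result, a symlink to ../../devices/virtual/net/veth0
    get_attributes: false
    get_checksum: false
    get_mime: false
    follow: false                # stat the link itself, not its target
```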
7491 1727203976.73431: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'veth0' 7491 1727203976.73605: in run() - task 0affcd87-79f5-0a4a-ad01-0000000008c3 7491 1727203976.73626: variable 'ansible_search_path' from source: unknown 7491 1727203976.73633: variable 'ansible_search_path' from source: unknown 7491 1727203976.73678: calling self._execute() 7491 1727203976.73786: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.73799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203976.73808: variable 'omit' from source: magic vars 7491 1727203976.74210: variable 'ansible_distribution_major_version' from source: facts 7491 1727203976.74231: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203976.74238: variable 'omit' from source: magic vars 7491 1727203976.74282: variable 'omit' from source: magic vars 7491 1727203976.74398: variable 'interface' from source: play vars 7491 1727203976.74423: variable 'omit' from source: magic vars 7491 1727203976.74474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203976.74509: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203976.74537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203976.74561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203976.74575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203976.74606: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203976.74609: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.74612: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203976.74735: Set connection var ansible_timeout to 10 7491 1727203976.74748: Set connection var ansible_pipelining to False 7491 1727203976.74754: Set connection var ansible_shell_type to sh 7491 1727203976.74760: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203976.74778: Set connection var ansible_shell_executable to /bin/sh 7491 1727203976.74783: Set connection var ansible_connection to ssh 7491 1727203976.74810: variable 'ansible_shell_executable' from source: unknown 7491 1727203976.74813: variable 'ansible_connection' from source: unknown 7491 1727203976.74816: variable 'ansible_module_compression' from source: unknown 7491 1727203976.74818: variable 'ansible_shell_type' from source: unknown 7491 1727203976.74824: variable 'ansible_shell_executable' from source: unknown 7491 1727203976.74826: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.74830: variable 'ansible_pipelining' from source: unknown 7491 1727203976.74833: variable 'ansible_timeout' from source: unknown 7491 1727203976.74837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203976.75007: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203976.75017: variable 'omit' from source: magic vars 7491 1727203976.75025: starting attempt loop 7491 1727203976.75029: running the handler 7491 1727203976.75285: variable 'interface_stat' from source: set_fact 7491 1727203976.75288: Evaluated conditional (interface_stat.stat.exists): True 7491 1727203976.75290: handler run complete 7491 1727203976.75292: attempt loop complete, returning result 7491 1727203976.75294: _execute() done 
7491 1727203976.75296: dumping result to json 7491 1727203976.75298: done dumping result, returning 7491 1727203976.75300: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'veth0' [0affcd87-79f5-0a4a-ad01-0000000008c3] 7491 1727203976.75302: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000008c3 7491 1727203976.75421: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000008c3 7491 1727203976.75423: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727203976.75485: no more pending results, returning what we have 7491 1727203976.75490: results queue empty 7491 1727203976.75491: checking for any_errors_fatal 7491 1727203976.75502: done checking for any_errors_fatal 7491 1727203976.75503: checking for max_fail_percentage 7491 1727203976.75505: done checking for max_fail_percentage 7491 1727203976.75506: checking to see if all hosts have failed and the running result is not ok 7491 1727203976.75507: done checking to see if all hosts have failed 7491 1727203976.75508: getting the remaining hosts for this loop 7491 1727203976.75513: done getting the remaining hosts for this loop 7491 1727203976.75520: getting the next task for host managed-node3 7491 1727203976.75528: done getting next task for host managed-node3 7491 1727203976.75532: ^ task is: TASK: Include the task 'assert_profile_present.yml' 7491 1727203976.75533: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203976.75539: getting variables 7491 1727203976.75541: in VariableManager get_vars() 7491 1727203976.75597: Calling all_inventory to load vars for managed-node3 7491 1727203976.75601: Calling groups_inventory to load vars for managed-node3 7491 1727203976.75603: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203976.75615: Calling all_plugins_play to load vars for managed-node3 7491 1727203976.75621: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203976.75624: Calling groups_plugins_play to load vars for managed-node3 7491 1727203976.77440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203976.79340: done with get_vars() 7491 1727203976.79373: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:44 Tuesday 24 September 2024 14:52:56 -0400 (0:00:00.067) 0:00:18.718 ***** 7491 1727203976.79498: entering _queue_task() for managed-node3/include_tasks 7491 1727203976.79846: worker is 1 (out of 1 available) 7491 1727203976.79860: exiting _queue_task() for managed-node3/include_tasks 7491 1727203976.79875: done queuing things up, now waiting for results queue to drain 7491 1727203976.79877: waiting for pending results... 
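The assertion that just passed ("Evaluated conditional (interface_stat.stat.exists): True", then "All assertions passed") corresponds to a task along these lines; a minimal sketch inferred from the evaluated conditional and the task name in the log, not the verbatim file at `tasks/assert_device_present.yml:5`:

```yaml
# Hypothetical sketch; only the conditional and task name are taken from the log.
- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists
```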
7491 1727203976.80200: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' 7491 1727203976.80303: in run() - task 0affcd87-79f5-0a4a-ad01-00000000005c 7491 1727203976.80323: variable 'ansible_search_path' from source: unknown 7491 1727203976.80365: calling self._execute() 7491 1727203976.80478: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.80488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203976.80498: variable 'omit' from source: magic vars 7491 1727203976.81022: variable 'ansible_distribution_major_version' from source: facts 7491 1727203976.81039: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203976.81044: _execute() done 7491 1727203976.81047: dumping result to json 7491 1727203976.81051: done dumping result, returning 7491 1727203976.81057: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' [0affcd87-79f5-0a4a-ad01-00000000005c] 7491 1727203976.81066: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000005c 7491 1727203976.81179: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000005c 7491 1727203976.81183: WORKER PROCESS EXITING 7491 1727203976.81209: no more pending results, returning what we have 7491 1727203976.81214: in VariableManager get_vars() 7491 1727203976.81276: Calling all_inventory to load vars for managed-node3 7491 1727203976.81280: Calling groups_inventory to load vars for managed-node3 7491 1727203976.81282: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203976.81294: Calling all_plugins_play to load vars for managed-node3 7491 1727203976.81296: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203976.81298: Calling groups_plugins_play to load vars for managed-node3 7491 1727203976.82208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 7491 1727203976.83439: done with get_vars() 7491 1727203976.83466: variable 'ansible_search_path' from source: unknown 7491 1727203976.83482: we have included files to process 7491 1727203976.83483: generating all_blocks data 7491 1727203976.83485: done generating all_blocks data 7491 1727203976.83489: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7491 1727203976.83490: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7491 1727203976.83493: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7491 1727203976.83702: in VariableManager get_vars() 7491 1727203976.83733: done with get_vars() 7491 1727203976.84011: done processing included file 7491 1727203976.84013: iterating over new_blocks loaded from include file 7491 1727203976.84015: in VariableManager get_vars() 7491 1727203976.84046: done with get_vars() 7491 1727203976.84048: filtering new block on tags 7491 1727203976.84070: done filtering new block on tags 7491 1727203976.84072: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 7491 1727203976.84078: extending task lists for all hosts with included blocks 7491 1727203976.87920: done extending task lists 7491 1727203976.87922: done processing included files 7491 1727203976.87922: results queue empty 7491 1727203976.87923: checking for any_errors_fatal 7491 1727203976.87926: done checking for any_errors_fatal 7491 1727203976.87926: checking for max_fail_percentage 7491 1727203976.87927: done checking for max_fail_percentage 7491 1727203976.87928: checking to see if all hosts have failed and the 
running result is not ok 7491 1727203976.87928: done checking to see if all hosts have failed 7491 1727203976.87929: getting the remaining hosts for this loop 7491 1727203976.87930: done getting the remaining hosts for this loop 7491 1727203976.87931: getting the next task for host managed-node3 7491 1727203976.87934: done getting next task for host managed-node3 7491 1727203976.87936: ^ task is: TASK: Include the task 'get_profile_stat.yml' 7491 1727203976.87937: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203976.87939: getting variables 7491 1727203976.87939: in VariableManager get_vars() 7491 1727203976.87956: Calling all_inventory to load vars for managed-node3 7491 1727203976.87959: Calling groups_inventory to load vars for managed-node3 7491 1727203976.87960: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203976.87968: Calling all_plugins_play to load vars for managed-node3 7491 1727203976.87970: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203976.87972: Calling groups_plugins_play to load vars for managed-node3 7491 1727203976.89250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203976.90273: done with get_vars() 7491 1727203976.90294: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:52:56 -0400 (0:00:00.108) 0:00:18.827 ***** 7491 1727203976.90355: entering _queue_task() for managed-node3/include_tasks 7491 1727203976.90596: worker is 1 (out of 1 available) 7491 1727203976.90611: exiting _queue_task() for managed-node3/include_tasks 7491 1727203976.90625: done queuing things up, now waiting for results queue to drain 7491 1727203976.90626: waiting for pending results... 
7491 1727203976.90817: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 7491 1727203976.90897: in run() - task 0affcd87-79f5-0a4a-ad01-000000000ade 7491 1727203976.90908: variable 'ansible_search_path' from source: unknown 7491 1727203976.90912: variable 'ansible_search_path' from source: unknown 7491 1727203976.90946: calling self._execute() 7491 1727203976.91039: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.91042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203976.91052: variable 'omit' from source: magic vars 7491 1727203976.91342: variable 'ansible_distribution_major_version' from source: facts 7491 1727203976.91352: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203976.91358: _execute() done 7491 1727203976.91361: dumping result to json 7491 1727203976.91365: done dumping result, returning 7491 1727203976.91371: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-0a4a-ad01-000000000ade] 7491 1727203976.91380: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000ade 7491 1727203976.91469: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000ade 7491 1727203976.91472: WORKER PROCESS EXITING 7491 1727203976.91506: no more pending results, returning what we have 7491 1727203976.91513: in VariableManager get_vars() 7491 1727203976.91573: Calling all_inventory to load vars for managed-node3 7491 1727203976.91577: Calling groups_inventory to load vars for managed-node3 7491 1727203976.91579: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203976.91597: Calling all_plugins_play to load vars for managed-node3 7491 1727203976.91600: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203976.91603: Calling groups_plugins_play to load vars for managed-node3 7491 1727203976.92432: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203976.93848: done with get_vars() 7491 1727203976.93880: variable 'ansible_search_path' from source: unknown 7491 1727203976.93882: variable 'ansible_search_path' from source: unknown 7491 1727203976.93922: we have included files to process 7491 1727203976.93923: generating all_blocks data 7491 1727203976.93925: done generating all_blocks data 7491 1727203976.93927: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7491 1727203976.93928: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7491 1727203976.93930: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7491 1727203976.95020: done processing included file 7491 1727203976.95022: iterating over new_blocks loaded from include file 7491 1727203976.95024: in VariableManager get_vars() 7491 1727203976.95051: done with get_vars() 7491 1727203976.95054: filtering new block on tags 7491 1727203976.95080: done filtering new block on tags 7491 1727203976.95083: in VariableManager get_vars() 7491 1727203976.95108: done with get_vars() 7491 1727203976.95110: filtering new block on tags 7491 1727203976.95132: done filtering new block on tags 7491 1727203976.95134: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 7491 1727203976.95140: extending task lists for all hosts with included blocks 7491 1727203976.95308: done extending task lists 7491 1727203976.95309: done processing included files 7491 1727203976.95310: results queue empty 7491 1727203976.95311: checking for any_errors_fatal 7491 
1727203976.95314: done checking for any_errors_fatal 7491 1727203976.95315: checking for max_fail_percentage 7491 1727203976.95316: done checking for max_fail_percentage 7491 1727203976.95317: checking to see if all hosts have failed and the running result is not ok 7491 1727203976.95318: done checking to see if all hosts have failed 7491 1727203976.95318: getting the remaining hosts for this loop 7491 1727203976.95320: done getting the remaining hosts for this loop 7491 1727203976.95322: getting the next task for host managed-node3 7491 1727203976.95326: done getting next task for host managed-node3 7491 1727203976.95329: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 7491 1727203976.95332: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203976.95334: getting variables 7491 1727203976.95335: in VariableManager get_vars() 7491 1727203976.95532: Calling all_inventory to load vars for managed-node3 7491 1727203976.95535: Calling groups_inventory to load vars for managed-node3 7491 1727203976.95537: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203976.95543: Calling all_plugins_play to load vars for managed-node3 7491 1727203976.95546: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203976.95549: Calling groups_plugins_play to load vars for managed-node3 7491 1727203976.96375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203976.97280: done with get_vars() 7491 1727203976.97298: done getting variables 7491 1727203976.97332: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:52:56 -0400 (0:00:00.069) 0:00:18.897 ***** 7491 1727203976.97356: entering _queue_task() for managed-node3/set_fact 7491 1727203976.97592: worker is 1 (out of 1 available) 7491 1727203976.97606: exiting _queue_task() for managed-node3/set_fact 7491 1727203976.97621: done queuing things up, now waiting for results queue to drain 7491 1727203976.97622: waiting for pending results... 
7491 1727203976.97821: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 7491 1727203976.97904: in run() - task 0affcd87-79f5-0a4a-ad01-000000000cef 7491 1727203976.97918: variable 'ansible_search_path' from source: unknown 7491 1727203976.97922: variable 'ansible_search_path' from source: unknown 7491 1727203976.97954: calling self._execute() 7491 1727203976.98040: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.98044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203976.98053: variable 'omit' from source: magic vars 7491 1727203976.98348: variable 'ansible_distribution_major_version' from source: facts 7491 1727203976.98358: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203976.98364: variable 'omit' from source: magic vars 7491 1727203976.98402: variable 'omit' from source: magic vars 7491 1727203976.98429: variable 'omit' from source: magic vars 7491 1727203976.98465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203976.98494: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203976.98513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203976.98530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203976.98539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203976.98562: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203976.98567: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.98570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 
1727203976.98647: Set connection var ansible_timeout to 10 7491 1727203976.98650: Set connection var ansible_pipelining to False 7491 1727203976.98656: Set connection var ansible_shell_type to sh 7491 1727203976.98661: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203976.98670: Set connection var ansible_shell_executable to /bin/sh 7491 1727203976.98674: Set connection var ansible_connection to ssh 7491 1727203976.98693: variable 'ansible_shell_executable' from source: unknown 7491 1727203976.98698: variable 'ansible_connection' from source: unknown 7491 1727203976.98701: variable 'ansible_module_compression' from source: unknown 7491 1727203976.98703: variable 'ansible_shell_type' from source: unknown 7491 1727203976.98706: variable 'ansible_shell_executable' from source: unknown 7491 1727203976.98708: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203976.98710: variable 'ansible_pipelining' from source: unknown 7491 1727203976.98712: variable 'ansible_timeout' from source: unknown 7491 1727203976.98716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203976.98819: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203976.98830: variable 'omit' from source: magic vars 7491 1727203976.98837: starting attempt loop 7491 1727203976.98840: running the handler 7491 1727203976.98850: handler run complete 7491 1727203976.98858: attempt loop complete, returning result 7491 1727203976.98861: _execute() done 7491 1727203976.98866: dumping result to json 7491 1727203976.98868: done dumping result, returning 7491 1727203976.98874: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and 
ansible_managed comment flag [0affcd87-79f5-0a4a-ad01-000000000cef] 7491 1727203976.98881: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cef 7491 1727203976.98963: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cef 7491 1727203976.98968: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 7491 1727203976.99023: no more pending results, returning what we have 7491 1727203976.99027: results queue empty 7491 1727203976.99028: checking for any_errors_fatal 7491 1727203976.99030: done checking for any_errors_fatal 7491 1727203976.99030: checking for max_fail_percentage 7491 1727203976.99032: done checking for max_fail_percentage 7491 1727203976.99033: checking to see if all hosts have failed and the running result is not ok 7491 1727203976.99034: done checking to see if all hosts have failed 7491 1727203976.99035: getting the remaining hosts for this loop 7491 1727203976.99037: done getting the remaining hosts for this loop 7491 1727203976.99041: getting the next task for host managed-node3 7491 1727203976.99048: done getting next task for host managed-node3 7491 1727203976.99051: ^ task is: TASK: Stat profile file 7491 1727203976.99055: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203976.99058: getting variables 7491 1727203976.99060: in VariableManager get_vars() 7491 1727203976.99118: Calling all_inventory to load vars for managed-node3 7491 1727203976.99121: Calling groups_inventory to load vars for managed-node3 7491 1727203976.99123: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203976.99133: Calling all_plugins_play to load vars for managed-node3 7491 1727203976.99136: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203976.99139: Calling groups_plugins_play to load vars for managed-node3 7491 1727203977.00045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203977.00968: done with get_vars() 7491 1727203977.00987: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:52:57 -0400 (0:00:00.037) 0:00:18.934 ***** 7491 1727203977.01061: entering _queue_task() for managed-node3/stat 7491 1727203977.01293: worker is 1 (out of 1 available) 7491 1727203977.01309: exiting _queue_task() for managed-node3/stat 7491 1727203977.01323: done queuing things up, now waiting for results queue to drain 7491 1727203977.01324: waiting for pending results... 
7491 1727203977.01513: running TaskExecutor() for managed-node3/TASK: Stat profile file 7491 1727203977.01590: in run() - task 0affcd87-79f5-0a4a-ad01-000000000cf0 7491 1727203977.01601: variable 'ansible_search_path' from source: unknown 7491 1727203977.01605: variable 'ansible_search_path' from source: unknown 7491 1727203977.01635: calling self._execute() 7491 1727203977.01716: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.01724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.01732: variable 'omit' from source: magic vars 7491 1727203977.02023: variable 'ansible_distribution_major_version' from source: facts 7491 1727203977.02034: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203977.02041: variable 'omit' from source: magic vars 7491 1727203977.02075: variable 'omit' from source: magic vars 7491 1727203977.02149: variable 'profile' from source: include params 7491 1727203977.02153: variable 'interface' from source: play vars 7491 1727203977.02207: variable 'interface' from source: play vars 7491 1727203977.02224: variable 'omit' from source: magic vars 7491 1727203977.02261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203977.02289: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203977.02309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203977.02324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203977.02334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203977.02359: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203977.02362: variable 
'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.02366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.02438: Set connection var ansible_timeout to 10 7491 1727203977.02443: Set connection var ansible_pipelining to False 7491 1727203977.02450: Set connection var ansible_shell_type to sh 7491 1727203977.02454: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203977.02464: Set connection var ansible_shell_executable to /bin/sh 7491 1727203977.02467: Set connection var ansible_connection to ssh 7491 1727203977.02486: variable 'ansible_shell_executable' from source: unknown 7491 1727203977.02489: variable 'ansible_connection' from source: unknown 7491 1727203977.02491: variable 'ansible_module_compression' from source: unknown 7491 1727203977.02494: variable 'ansible_shell_type' from source: unknown 7491 1727203977.02496: variable 'ansible_shell_executable' from source: unknown 7491 1727203977.02499: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.02503: variable 'ansible_pipelining' from source: unknown 7491 1727203977.02506: variable 'ansible_timeout' from source: unknown 7491 1727203977.02509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.02665: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203977.02676: variable 'omit' from source: magic vars 7491 1727203977.02679: starting attempt loop 7491 1727203977.02682: running the handler 7491 1727203977.02695: _low_level_execute_command(): starting 7491 1727203977.02702: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203977.03240: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.03250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.03280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.03294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.03305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.03354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203977.03368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.03430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.05057: stdout chunk (state=3): >>>/root <<< 7491 1727203977.05154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203977.05220: stderr chunk (state=3): >>><<< 7491 1727203977.05230: stdout chunk (state=3): >>><<< 7491 1727203977.05253: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203977.05272: _low_level_execute_command(): starting 7491 1727203977.05278: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203977.0525835-8593-6404837328003 `" && echo ansible-tmp-1727203977.0525835-8593-6404837328003="` echo /root/.ansible/tmp/ansible-tmp-1727203977.0525835-8593-6404837328003 `" ) && sleep 0' 7491 1727203977.05763: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.05772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.05801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203977.05826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.05877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203977.05884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203977.05891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.05952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.07766: stdout chunk (state=3): >>>ansible-tmp-1727203977.0525835-8593-6404837328003=/root/.ansible/tmp/ansible-tmp-1727203977.0525835-8593-6404837328003 <<< 7491 1727203977.07873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203977.07936: stderr chunk (state=3): >>><<< 7491 1727203977.07940: stdout chunk (state=3): >>><<< 7491 1727203977.07960: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203977.0525835-8593-6404837328003=/root/.ansible/tmp/ansible-tmp-1727203977.0525835-8593-6404837328003 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203977.08004: variable 'ansible_module_compression' from source: unknown 7491 1727203977.08057: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7491 1727203977.08087: variable 'ansible_facts' from source: unknown 7491 1727203977.08152: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203977.0525835-8593-6404837328003/AnsiballZ_stat.py 7491 1727203977.08260: Sending initial data 7491 1727203977.08263: Sent initial data (149 bytes) 7491 1727203977.08993: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.08999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.09030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203977.09045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 7491 1727203977.09055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.09106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203977.09112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203977.09122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.09187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.10860: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203977.10893: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203977.10933: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpd77qhr7s /root/.ansible/tmp/ansible-tmp-1727203977.0525835-8593-6404837328003/AnsiballZ_stat.py <<< 7491 1727203977.10971: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203977.11767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203977.11886: stderr chunk 
(state=3): >>><<< 7491 1727203977.11890: stdout chunk (state=3): >>><<< 7491 1727203977.11907: done transferring module to remote 7491 1727203977.11919: _low_level_execute_command(): starting 7491 1727203977.11922: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203977.0525835-8593-6404837328003/ /root/.ansible/tmp/ansible-tmp-1727203977.0525835-8593-6404837328003/AnsiballZ_stat.py && sleep 0' 7491 1727203977.12410: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.12427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.12445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203977.12458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.12470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.12525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203977.12531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.12586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.14257: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203977.14317: stderr chunk (state=3): >>><<< 7491 1727203977.14325: stdout chunk (state=3): >>><<< 7491 1727203977.14342: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203977.14345: _low_level_execute_command(): starting 7491 1727203977.14351: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203977.0525835-8593-6404837328003/AnsiballZ_stat.py && sleep 0' 7491 1727203977.14833: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.14837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.14877: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.14891: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.14948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203977.14957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.15022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.27987: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7491 1727203977.28966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203977.29026: stderr chunk (state=3): >>><<< 7491 1727203977.29031: stdout chunk (state=3): >>><<< 7491 1727203977.29047: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
7491 1727203977.29078: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203977.0525835-8593-6404837328003/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203977.29086: _low_level_execute_command(): starting 7491 1727203977.29091: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203977.0525835-8593-6404837328003/ > /dev/null 2>&1 && sleep 0' 7491 1727203977.29576: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.29596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.29620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.29632: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.29678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203977.29689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.29744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.31494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203977.31554: stderr chunk (state=3): >>><<< 7491 1727203977.31557: stdout chunk (state=3): >>><<< 7491 1727203977.31575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 
1727203977.31585: handler run complete 7491 1727203977.31600: attempt loop complete, returning result 7491 1727203977.31603: _execute() done 7491 1727203977.31605: dumping result to json 7491 1727203977.31609: done dumping result, returning 7491 1727203977.31616: done running TaskExecutor() for managed-node3/TASK: Stat profile file [0affcd87-79f5-0a4a-ad01-000000000cf0] 7491 1727203977.31628: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf0 7491 1727203977.31719: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf0 7491 1727203977.31722: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 7491 1727203977.31785: no more pending results, returning what we have 7491 1727203977.31789: results queue empty 7491 1727203977.31790: checking for any_errors_fatal 7491 1727203977.31797: done checking for any_errors_fatal 7491 1727203977.31797: checking for max_fail_percentage 7491 1727203977.31799: done checking for max_fail_percentage 7491 1727203977.31800: checking to see if all hosts have failed and the running result is not ok 7491 1727203977.31801: done checking to see if all hosts have failed 7491 1727203977.31802: getting the remaining hosts for this loop 7491 1727203977.31804: done getting the remaining hosts for this loop 7491 1727203977.31808: getting the next task for host managed-node3 7491 1727203977.31814: done getting next task for host managed-node3 7491 1727203977.31817: ^ task is: TASK: Set NM profile exist flag based on the profile files 7491 1727203977.31820: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203977.31824: getting variables 7491 1727203977.31826: in VariableManager get_vars() 7491 1727203977.31876: Calling all_inventory to load vars for managed-node3 7491 1727203977.31880: Calling groups_inventory to load vars for managed-node3 7491 1727203977.31882: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203977.31892: Calling all_plugins_play to load vars for managed-node3 7491 1727203977.31894: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203977.31897: Calling groups_plugins_play to load vars for managed-node3 7491 1727203977.32701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203977.33617: done with get_vars() 7491 1727203977.33635: done getting variables 7491 1727203977.33683: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:52:57 -0400 (0:00:00.326) 0:00:19.260 ***** 7491 1727203977.33708: entering _queue_task() for managed-node3/set_fact 7491 
1727203977.33935: worker is 1 (out of 1 available) 7491 1727203977.33948: exiting _queue_task() for managed-node3/set_fact 7491 1727203977.33961: done queuing things up, now waiting for results queue to drain 7491 1727203977.33963: waiting for pending results... 7491 1727203977.34151: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 7491 1727203977.34224: in run() - task 0affcd87-79f5-0a4a-ad01-000000000cf1 7491 1727203977.34240: variable 'ansible_search_path' from source: unknown 7491 1727203977.34244: variable 'ansible_search_path' from source: unknown 7491 1727203977.34272: calling self._execute() 7491 1727203977.34355: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.34359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.34370: variable 'omit' from source: magic vars 7491 1727203977.34645: variable 'ansible_distribution_major_version' from source: facts 7491 1727203977.34655: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203977.34746: variable 'profile_stat' from source: set_fact 7491 1727203977.34758: Evaluated conditional (profile_stat.stat.exists): False 7491 1727203977.34761: when evaluation is False, skipping this task 7491 1727203977.34765: _execute() done 7491 1727203977.34768: dumping result to json 7491 1727203977.34771: done dumping result, returning 7491 1727203977.34781: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-0a4a-ad01-000000000cf1] 7491 1727203977.34784: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf1 7491 1727203977.34872: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf1 7491 1727203977.34875: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7491 
1727203977.34929: no more pending results, returning what we have 7491 1727203977.34933: results queue empty 7491 1727203977.34934: checking for any_errors_fatal 7491 1727203977.34943: done checking for any_errors_fatal 7491 1727203977.34944: checking for max_fail_percentage 7491 1727203977.34946: done checking for max_fail_percentage 7491 1727203977.34947: checking to see if all hosts have failed and the running result is not ok 7491 1727203977.34948: done checking to see if all hosts have failed 7491 1727203977.34949: getting the remaining hosts for this loop 7491 1727203977.34951: done getting the remaining hosts for this loop 7491 1727203977.34954: getting the next task for host managed-node3 7491 1727203977.34961: done getting next task for host managed-node3 7491 1727203977.34965: ^ task is: TASK: Get NM profile info 7491 1727203977.34969: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203977.34972: getting variables 7491 1727203977.34974: in VariableManager get_vars() 7491 1727203977.35025: Calling all_inventory to load vars for managed-node3 7491 1727203977.35029: Calling groups_inventory to load vars for managed-node3 7491 1727203977.35031: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203977.35040: Calling all_plugins_play to load vars for managed-node3 7491 1727203977.35042: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203977.35045: Calling groups_plugins_play to load vars for managed-node3 7491 1727203977.35948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203977.37038: done with get_vars() 7491 1727203977.37057: done getting variables 7491 1727203977.37132: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:52:57 -0400 (0:00:00.034) 0:00:19.295 ***** 7491 1727203977.37156: entering _queue_task() for managed-node3/shell 7491 1727203977.37157: Creating lock for shell 7491 1727203977.37398: worker is 1 (out of 1 available) 7491 1727203977.37415: exiting _queue_task() for managed-node3/shell 7491 1727203977.37428: done queuing things up, now waiting for results queue to drain 7491 1727203977.37429: waiting for pending results... 
7491 1727203977.37610: running TaskExecutor() for managed-node3/TASK: Get NM profile info 7491 1727203977.37689: in run() - task 0affcd87-79f5-0a4a-ad01-000000000cf2 7491 1727203977.37705: variable 'ansible_search_path' from source: unknown 7491 1727203977.37709: variable 'ansible_search_path' from source: unknown 7491 1727203977.37739: calling self._execute() 7491 1727203977.37822: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.37827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.37834: variable 'omit' from source: magic vars 7491 1727203977.38203: variable 'ansible_distribution_major_version' from source: facts 7491 1727203977.38206: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203977.38210: variable 'omit' from source: magic vars 7491 1727203977.38473: variable 'omit' from source: magic vars 7491 1727203977.38477: variable 'profile' from source: include params 7491 1727203977.38480: variable 'interface' from source: play vars 7491 1727203977.38483: variable 'interface' from source: play vars 7491 1727203977.38485: variable 'omit' from source: magic vars 7491 1727203977.38487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203977.38514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203977.38536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203977.38551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203977.38562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203977.38593: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203977.38596: variable 
'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.38599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.38693: Set connection var ansible_timeout to 10 7491 1727203977.38700: Set connection var ansible_pipelining to False 7491 1727203977.38705: Set connection var ansible_shell_type to sh 7491 1727203977.38711: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203977.38721: Set connection var ansible_shell_executable to /bin/sh 7491 1727203977.38724: Set connection var ansible_connection to ssh 7491 1727203977.38746: variable 'ansible_shell_executable' from source: unknown 7491 1727203977.38748: variable 'ansible_connection' from source: unknown 7491 1727203977.38751: variable 'ansible_module_compression' from source: unknown 7491 1727203977.38753: variable 'ansible_shell_type' from source: unknown 7491 1727203977.38756: variable 'ansible_shell_executable' from source: unknown 7491 1727203977.38758: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.38763: variable 'ansible_pipelining' from source: unknown 7491 1727203977.38767: variable 'ansible_timeout' from source: unknown 7491 1727203977.38802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.38902: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203977.38913: variable 'omit' from source: magic vars 7491 1727203977.38919: starting attempt loop 7491 1727203977.38923: running the handler 7491 1727203977.38930: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203977.38949: _low_level_execute_command(): starting 7491 1727203977.38957: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203977.39689: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203977.39701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.39712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.39730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.39770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203977.39778: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203977.39788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.39800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203977.39808: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203977.39819: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203977.39824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.39833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.39844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.39851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203977.39858: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203977.39872: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.39942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203977.39962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203977.39976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.40046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.41569: stdout chunk (state=3): >>>/root <<< 7491 1727203977.41674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203977.41728: stderr chunk (state=3): >>><<< 7491 1727203977.41732: stdout chunk (state=3): >>><<< 7491 1727203977.41753: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 
1727203977.41769: _low_level_execute_command(): starting 7491 1727203977.41775: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203977.4175408-8602-88051628336594 `" && echo ansible-tmp-1727203977.4175408-8602-88051628336594="` echo /root/.ansible/tmp/ansible-tmp-1727203977.4175408-8602-88051628336594 `" ) && sleep 0' 7491 1727203977.42246: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203977.42255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.42268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.42277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.42308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203977.42315: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203977.42324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.42335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203977.42342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203977.42348: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.42355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.42367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.42372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.42424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203977.42446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203977.42448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.42507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.44299: stdout chunk (state=3): >>>ansible-tmp-1727203977.4175408-8602-88051628336594=/root/.ansible/tmp/ansible-tmp-1727203977.4175408-8602-88051628336594 <<< 7491 1727203977.44409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203977.44508: stderr chunk (state=3): >>><<< 7491 1727203977.44523: stdout chunk (state=3): >>><<< 7491 1727203977.44671: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203977.4175408-8602-88051628336594=/root/.ansible/tmp/ansible-tmp-1727203977.4175408-8602-88051628336594 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203977.44674: variable 'ansible_module_compression' from source: unknown 7491 1727203977.44677: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203977.44793: variable 'ansible_facts' from source: unknown 7491 1727203977.44803: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203977.4175408-8602-88051628336594/AnsiballZ_command.py 7491 1727203977.44973: Sending initial data 7491 1727203977.44977: Sent initial data (153 bytes) 7491 1727203977.46059: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203977.46076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.46090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.46120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.46163: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203977.46179: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203977.46192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.46213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203977.46230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203977.46241: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203977.46252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.46266: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7491 1727203977.46281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.46292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203977.46301: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203977.46324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.46406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203977.46435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203977.46454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.46530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.48220: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203977.48252: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203977.48297: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpek9nvlv7 /root/.ansible/tmp/ansible-tmp-1727203977.4175408-8602-88051628336594/AnsiballZ_command.py <<< 7491 1727203977.48341: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such 
file or directory <<< 7491 1727203977.49467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203977.49680: stderr chunk (state=3): >>><<< 7491 1727203977.49683: stdout chunk (state=3): >>><<< 7491 1727203977.49685: done transferring module to remote 7491 1727203977.49688: _low_level_execute_command(): starting 7491 1727203977.49690: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203977.4175408-8602-88051628336594/ /root/.ansible/tmp/ansible-tmp-1727203977.4175408-8602-88051628336594/AnsiballZ_command.py && sleep 0' 7491 1727203977.50293: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203977.50307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.50326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.50351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.50397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203977.50408: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203977.50422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.50445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203977.50455: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203977.50466: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203977.50476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.50489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.50506: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.50525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203977.50538: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203977.50561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.50635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203977.50657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203977.50683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.50753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.52557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203977.52627: stderr chunk (state=3): >>><<< 7491 1727203977.52632: stdout chunk (state=3): >>><<< 7491 1727203977.52659: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203977.52665: _low_level_execute_command(): starting 7491 1727203977.52668: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203977.4175408-8602-88051628336594/AnsiballZ_command.py && sleep 0' 7491 1727203977.53300: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203977.53308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.53370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.53374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.53377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203977.53380: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203977.53385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.53398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203977.53406: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203977.53412: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203977.53423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.53428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.53440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.53446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 <<< 7491 1727203977.53453: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203977.53462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.53539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203977.53554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203977.53557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.53639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.69054: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-24 14:52:57.665155", "end": "2024-09-24 14:52:57.689616", "delta": "0:00:00.024461", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203977.70304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203977.70338: stderr chunk (state=3): >>><<< 7491 1727203977.70341: stdout chunk (state=3): >>><<< 7491 1727203977.70373: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-24 14:52:57.665155", "end": "2024-09-24 14:52:57.689616", "delta": "0:00:00.024461", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
7491 1727203977.70511: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203977.4175408-8602-88051628336594/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203977.70515: _low_level_execute_command(): starting 7491 1727203977.70521: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203977.4175408-8602-88051628336594/ > /dev/null 2>&1 && sleep 0' 7491 1727203977.71110: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203977.71129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.71145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.71167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.71211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203977.71228: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203977.71243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.71261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 
1727203977.71280: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203977.71291: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203977.71304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203977.71321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203977.71338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203977.71351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203977.71363: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203977.71380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203977.71459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203977.71477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203977.71493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203977.71587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203977.73323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203977.73409: stderr chunk (state=3): >>><<< 7491 1727203977.73425: stdout chunk (state=3): >>><<< 7491 1727203977.73447: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203977.73455: handler run complete 7491 1727203977.73483: Evaluated conditional (False): False 7491 1727203977.73493: attempt loop complete, returning result 7491 1727203977.73496: _execute() done 7491 1727203977.73499: dumping result to json 7491 1727203977.73504: done dumping result, returning 7491 1727203977.73512: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [0affcd87-79f5-0a4a-ad01-000000000cf2] 7491 1727203977.73521: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf2 7491 1727203977.73627: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf2 7491 1727203977.73630: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.024461", "end": "2024-09-24 14:52:57.689616", "rc": 0, "start": "2024-09-24 14:52:57.665155" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 7491 1727203977.73753: no more pending results, returning what we have 7491 1727203977.73756: results queue empty 7491 1727203977.73757: checking for any_errors_fatal 7491 1727203977.73766: done checking for any_errors_fatal 7491 1727203977.73767: checking for max_fail_percentage 7491 1727203977.73769: done checking for 
max_fail_percentage 7491 1727203977.73770: checking to see if all hosts have failed and the running result is not ok 7491 1727203977.73771: done checking to see if all hosts have failed 7491 1727203977.73772: getting the remaining hosts for this loop 7491 1727203977.73774: done getting the remaining hosts for this loop 7491 1727203977.73778: getting the next task for host managed-node3 7491 1727203977.73787: done getting next task for host managed-node3 7491 1727203977.73790: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7491 1727203977.73794: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203977.73798: getting variables 7491 1727203977.73800: in VariableManager get_vars() 7491 1727203977.73854: Calling all_inventory to load vars for managed-node3 7491 1727203977.73857: Calling groups_inventory to load vars for managed-node3 7491 1727203977.73859: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203977.73873: Calling all_plugins_play to load vars for managed-node3 7491 1727203977.73876: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203977.73879: Calling groups_plugins_play to load vars for managed-node3 7491 1727203977.75549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203977.77256: done with get_vars() 7491 1727203977.77288: done getting variables 7491 1727203977.77344: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:52:57 -0400 (0:00:00.402) 0:00:19.697 ***** 7491 1727203977.77385: entering _queue_task() for managed-node3/set_fact 7491 1727203977.77694: worker is 1 (out of 1 available) 7491 1727203977.77706: exiting _queue_task() for managed-node3/set_fact 7491 1727203977.77720: done queuing things up, now waiting for results queue to drain 7491 1727203977.77721: waiting for pending results... 
7491 1727203977.78002: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7491 1727203977.78110: in run() - task 0affcd87-79f5-0a4a-ad01-000000000cf3 7491 1727203977.78130: variable 'ansible_search_path' from source: unknown 7491 1727203977.78134: variable 'ansible_search_path' from source: unknown 7491 1727203977.78172: calling self._execute() 7491 1727203977.78279: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.78284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.78295: variable 'omit' from source: magic vars 7491 1727203977.78688: variable 'ansible_distribution_major_version' from source: facts 7491 1727203977.78700: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203977.78844: variable 'nm_profile_exists' from source: set_fact 7491 1727203977.78859: Evaluated conditional (nm_profile_exists.rc == 0): True 7491 1727203977.78871: variable 'omit' from source: magic vars 7491 1727203977.78922: variable 'omit' from source: magic vars 7491 1727203977.78960: variable 'omit' from source: magic vars 7491 1727203977.79006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203977.79043: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203977.79066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203977.79084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203977.79101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203977.79131: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203977.79134: 
variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.79137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.79248: Set connection var ansible_timeout to 10 7491 1727203977.79257: Set connection var ansible_pipelining to False 7491 1727203977.79265: Set connection var ansible_shell_type to sh 7491 1727203977.79272: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203977.79281: Set connection var ansible_shell_executable to /bin/sh 7491 1727203977.79286: Set connection var ansible_connection to ssh 7491 1727203977.79313: variable 'ansible_shell_executable' from source: unknown 7491 1727203977.79318: variable 'ansible_connection' from source: unknown 7491 1727203977.79321: variable 'ansible_module_compression' from source: unknown 7491 1727203977.79323: variable 'ansible_shell_type' from source: unknown 7491 1727203977.79326: variable 'ansible_shell_executable' from source: unknown 7491 1727203977.79328: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.79330: variable 'ansible_pipelining' from source: unknown 7491 1727203977.79332: variable 'ansible_timeout' from source: unknown 7491 1727203977.79334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.79484: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203977.79495: variable 'omit' from source: magic vars 7491 1727203977.79500: starting attempt loop 7491 1727203977.79503: running the handler 7491 1727203977.79519: handler run complete 7491 1727203977.79531: attempt loop complete, returning result 7491 1727203977.79534: _execute() done 7491 1727203977.79536: dumping result to json 
7491 1727203977.79538: done dumping result, returning 7491 1727203977.79547: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-0a4a-ad01-000000000cf3] 7491 1727203977.79552: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf3 7491 1727203977.79647: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf3 7491 1727203977.79650: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 7491 1727203977.79749: no more pending results, returning what we have 7491 1727203977.79752: results queue empty 7491 1727203977.79754: checking for any_errors_fatal 7491 1727203977.79767: done checking for any_errors_fatal 7491 1727203977.79768: checking for max_fail_percentage 7491 1727203977.79770: done checking for max_fail_percentage 7491 1727203977.79771: checking to see if all hosts have failed and the running result is not ok 7491 1727203977.79772: done checking to see if all hosts have failed 7491 1727203977.79773: getting the remaining hosts for this loop 7491 1727203977.79776: done getting the remaining hosts for this loop 7491 1727203977.79781: getting the next task for host managed-node3 7491 1727203977.79791: done getting next task for host managed-node3 7491 1727203977.79793: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 7491 1727203977.79796: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203977.79800: getting variables 7491 1727203977.79802: in VariableManager get_vars() 7491 1727203977.79856: Calling all_inventory to load vars for managed-node3 7491 1727203977.79860: Calling groups_inventory to load vars for managed-node3 7491 1727203977.79862: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203977.79877: Calling all_plugins_play to load vars for managed-node3 7491 1727203977.79880: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203977.79884: Calling groups_plugins_play to load vars for managed-node3 7491 1727203977.81652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203977.83400: done with get_vars() 7491 1727203977.83429: done getting variables 7491 1727203977.83498: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203977.83629: variable 'profile' from source: include params 7491 1727203977.83634: variable 'interface' from source: play vars 7491 1727203977.83705: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:52:57 -0400 (0:00:00.063) 0:00:19.761 ***** 7491 1727203977.83744: entering _queue_task() for managed-node3/command 7491 1727203977.84066: worker is 1 (out of 1 available) 7491 1727203977.84079: exiting _queue_task() for managed-node3/command 7491 1727203977.84097: done queuing things up, now waiting for results queue to drain 7491 1727203977.84099: waiting for pending results... 7491 1727203977.84389: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-veth0 7491 1727203977.84503: in run() - task 0affcd87-79f5-0a4a-ad01-000000000cf5 7491 1727203977.84515: variable 'ansible_search_path' from source: unknown 7491 1727203977.84522: variable 'ansible_search_path' from source: unknown 7491 1727203977.84565: calling self._execute() 7491 1727203977.84678: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.84682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.84693: variable 'omit' from source: magic vars 7491 1727203977.85095: variable 'ansible_distribution_major_version' from source: facts 7491 1727203977.85108: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203977.85238: variable 'profile_stat' from source: set_fact 7491 1727203977.85250: Evaluated conditional (profile_stat.stat.exists): False 7491 1727203977.85254: when evaluation is False, skipping this task 7491 1727203977.85256: _execute() done 7491 1727203977.85259: dumping result to json 7491 1727203977.85263: done dumping result, returning 7491 1727203977.85271: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [0affcd87-79f5-0a4a-ad01-000000000cf5] 7491 1727203977.85278: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf5 7491 1727203977.85374: done sending 
task result for task 0affcd87-79f5-0a4a-ad01-000000000cf5 7491 1727203977.85376: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7491 1727203977.85459: no more pending results, returning what we have 7491 1727203977.85466: results queue empty 7491 1727203977.85468: checking for any_errors_fatal 7491 1727203977.85477: done checking for any_errors_fatal 7491 1727203977.85477: checking for max_fail_percentage 7491 1727203977.85479: done checking for max_fail_percentage 7491 1727203977.85481: checking to see if all hosts have failed and the running result is not ok 7491 1727203977.85482: done checking to see if all hosts have failed 7491 1727203977.85483: getting the remaining hosts for this loop 7491 1727203977.85485: done getting the remaining hosts for this loop 7491 1727203977.85490: getting the next task for host managed-node3 7491 1727203977.85498: done getting next task for host managed-node3 7491 1727203977.85500: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 7491 1727203977.85504: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7491 1727203977.85511: getting variables 7491 1727203977.85513: in VariableManager get_vars() 7491 1727203977.85574: Calling all_inventory to load vars for managed-node3 7491 1727203977.85577: Calling groups_inventory to load vars for managed-node3 7491 1727203977.85580: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203977.85596: Calling all_plugins_play to load vars for managed-node3 7491 1727203977.85599: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203977.85602: Calling groups_plugins_play to load vars for managed-node3 7491 1727203977.87269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203977.89104: done with get_vars() 7491 1727203977.89127: done getting variables 7491 1727203977.89198: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203977.89313: variable 'profile' from source: include params 7491 1727203977.89317: variable 'interface' from source: play vars 7491 1727203977.89384: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:52:57 -0400 (0:00:00.056) 0:00:19.818 ***** 7491 1727203977.89416: entering _queue_task() for managed-node3/set_fact 7491 1727203977.89740: worker is 1 (out of 1 available) 7491 1727203977.89753: exiting _queue_task() for managed-node3/set_fact 7491 1727203977.89769: done queuing things up, now waiting for results queue to drain 7491 1727203977.89771: waiting for pending results... 
7491 1727203977.90068: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 7491 1727203977.90178: in run() - task 0affcd87-79f5-0a4a-ad01-000000000cf6 7491 1727203977.90191: variable 'ansible_search_path' from source: unknown 7491 1727203977.90196: variable 'ansible_search_path' from source: unknown 7491 1727203977.90237: calling self._execute() 7491 1727203977.90341: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.90347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.90359: variable 'omit' from source: magic vars 7491 1727203977.90732: variable 'ansible_distribution_major_version' from source: facts 7491 1727203977.90743: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203977.90883: variable 'profile_stat' from source: set_fact 7491 1727203977.90897: Evaluated conditional (profile_stat.stat.exists): False 7491 1727203977.90900: when evaluation is False, skipping this task 7491 1727203977.90903: _execute() done 7491 1727203977.90906: dumping result to json 7491 1727203977.90909: done dumping result, returning 7491 1727203977.90919: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0affcd87-79f5-0a4a-ad01-000000000cf6] 7491 1727203977.90922: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf6 7491 1727203977.91022: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf6 7491 1727203977.91026: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7491 1727203977.91076: no more pending results, returning what we have 7491 1727203977.91081: results queue empty 7491 1727203977.91083: checking for any_errors_fatal 7491 1727203977.91091: done checking for any_errors_fatal 7491 1727203977.91092: checking for 
max_fail_percentage 7491 1727203977.91094: done checking for max_fail_percentage 7491 1727203977.91095: checking to see if all hosts have failed and the running result is not ok 7491 1727203977.91096: done checking to see if all hosts have failed 7491 1727203977.91097: getting the remaining hosts for this loop 7491 1727203977.91099: done getting the remaining hosts for this loop 7491 1727203977.91103: getting the next task for host managed-node3 7491 1727203977.91112: done getting next task for host managed-node3 7491 1727203977.91115: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 7491 1727203977.91118: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203977.91123: getting variables 7491 1727203977.91125: in VariableManager get_vars() 7491 1727203977.91183: Calling all_inventory to load vars for managed-node3 7491 1727203977.91186: Calling groups_inventory to load vars for managed-node3 7491 1727203977.91189: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203977.91204: Calling all_plugins_play to load vars for managed-node3 7491 1727203977.91207: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203977.91211: Calling groups_plugins_play to load vars for managed-node3 7491 1727203977.92836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203977.94539: done with get_vars() 7491 1727203977.94573: done getting variables 7491 1727203977.94641: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203977.94767: variable 'profile' from source: include params 7491 1727203977.94771: variable 'interface' from source: play vars 7491 1727203977.94838: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:52:57 -0400 (0:00:00.054) 0:00:19.872 ***** 7491 1727203977.94873: entering _queue_task() for managed-node3/command 7491 1727203977.95186: worker is 1 (out of 1 available) 7491 1727203977.95199: exiting _queue_task() for managed-node3/command 7491 1727203977.95213: done queuing things up, now waiting for results queue to drain 7491 1727203977.95214: waiting for pending results... 
7491 1727203977.95506: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-veth0 7491 1727203977.95621: in run() - task 0affcd87-79f5-0a4a-ad01-000000000cf7 7491 1727203977.95633: variable 'ansible_search_path' from source: unknown 7491 1727203977.95637: variable 'ansible_search_path' from source: unknown 7491 1727203977.95676: calling self._execute() 7491 1727203977.95779: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203977.95784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203977.95795: variable 'omit' from source: magic vars 7491 1727203977.96169: variable 'ansible_distribution_major_version' from source: facts 7491 1727203977.96181: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203977.96319: variable 'profile_stat' from source: set_fact 7491 1727203977.96330: Evaluated conditional (profile_stat.stat.exists): False 7491 1727203977.96333: when evaluation is False, skipping this task 7491 1727203977.96336: _execute() done 7491 1727203977.96339: dumping result to json 7491 1727203977.96341: done dumping result, returning 7491 1727203977.96348: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-veth0 [0affcd87-79f5-0a4a-ad01-000000000cf7] 7491 1727203977.96359: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf7 7491 1727203977.96453: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf7 7491 1727203977.96456: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7491 1727203977.96511: no more pending results, returning what we have 7491 1727203977.96515: results queue empty 7491 1727203977.96516: checking for any_errors_fatal 7491 1727203977.96524: done checking for any_errors_fatal 7491 1727203977.96525: checking for max_fail_percentage 7491 
1727203977.96527: done checking for max_fail_percentage 7491 1727203977.96528: checking to see if all hosts have failed and the running result is not ok 7491 1727203977.96529: done checking to see if all hosts have failed 7491 1727203977.96530: getting the remaining hosts for this loop 7491 1727203977.96532: done getting the remaining hosts for this loop 7491 1727203977.96537: getting the next task for host managed-node3 7491 1727203977.96544: done getting next task for host managed-node3 7491 1727203977.96546: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 7491 1727203977.96551: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203977.96556: getting variables 7491 1727203977.96558: in VariableManager get_vars() 7491 1727203977.96617: Calling all_inventory to load vars for managed-node3 7491 1727203977.96620: Calling groups_inventory to load vars for managed-node3 7491 1727203977.96623: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203977.96637: Calling all_plugins_play to load vars for managed-node3 7491 1727203977.96641: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203977.96644: Calling groups_plugins_play to load vars for managed-node3 7491 1727203977.98839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203978.01274: done with get_vars() 7491 1727203978.01306: done getting variables 7491 1727203978.01375: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203978.01502: variable 'profile' from source: include params 7491 1727203978.01507: variable 'interface' from source: play vars 7491 1727203978.01578: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:52:58 -0400 (0:00:00.067) 0:00:19.939 ***** 7491 1727203978.01610: entering _queue_task() for managed-node3/set_fact 7491 1727203978.01920: worker is 1 (out of 1 available) 7491 1727203978.01933: exiting _queue_task() for managed-node3/set_fact 7491 1727203978.01947: done queuing things up, now waiting for results queue to drain 7491 1727203978.01948: waiting for pending results... 
7491 1727203978.02249: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-veth0 7491 1727203978.02358: in run() - task 0affcd87-79f5-0a4a-ad01-000000000cf8 7491 1727203978.02373: variable 'ansible_search_path' from source: unknown 7491 1727203978.02377: variable 'ansible_search_path' from source: unknown 7491 1727203978.02419: calling self._execute() 7491 1727203978.02526: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.02536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.02547: variable 'omit' from source: magic vars 7491 1727203978.02923: variable 'ansible_distribution_major_version' from source: facts 7491 1727203978.02938: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203978.03068: variable 'profile_stat' from source: set_fact 7491 1727203978.03084: Evaluated conditional (profile_stat.stat.exists): False 7491 1727203978.03087: when evaluation is False, skipping this task 7491 1727203978.03092: _execute() done 7491 1727203978.03094: dumping result to json 7491 1727203978.03097: done dumping result, returning 7491 1727203978.03101: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [0affcd87-79f5-0a4a-ad01-000000000cf8] 7491 1727203978.03113: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf8 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7491 1727203978.03249: no more pending results, returning what we have 7491 1727203978.03254: results queue empty 7491 1727203978.03255: checking for any_errors_fatal 7491 1727203978.03262: done checking for any_errors_fatal 7491 1727203978.03265: checking for max_fail_percentage 7491 1727203978.03267: done checking for max_fail_percentage 7491 1727203978.03269: checking to see if all hosts have failed and the running 
result is not ok 7491 1727203978.03270: done checking to see if all hosts have failed 7491 1727203978.03271: getting the remaining hosts for this loop 7491 1727203978.03273: done getting the remaining hosts for this loop 7491 1727203978.03277: getting the next task for host managed-node3 7491 1727203978.03289: done getting next task for host managed-node3 7491 1727203978.03292: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 7491 1727203978.03295: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203978.03299: getting variables 7491 1727203978.03301: in VariableManager get_vars() 7491 1727203978.03359: Calling all_inventory to load vars for managed-node3 7491 1727203978.03362: Calling groups_inventory to load vars for managed-node3 7491 1727203978.03367: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203978.03382: Calling all_plugins_play to load vars for managed-node3 7491 1727203978.03386: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203978.03389: Calling groups_plugins_play to load vars for managed-node3 7491 1727203978.04000: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000cf8 7491 1727203978.04003: WORKER PROCESS EXITING 7491 1727203978.05036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203978.06727: done with get_vars() 7491 1727203978.06752: done getting variables 7491 1727203978.06817: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203978.06941: variable 'profile' from source: include params 7491 1727203978.06945: variable 'interface' from source: play vars 7491 1727203978.07008: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:52:58 -0400 (0:00:00.054) 0:00:19.994 ***** 7491 1727203978.07044: entering _queue_task() for managed-node3/assert 7491 1727203978.07338: worker is 1 (out of 1 available) 7491 1727203978.07355: exiting _queue_task() for managed-node3/assert 7491 1727203978.07370: done 
queuing things up, now waiting for results queue to drain 7491 1727203978.07371: waiting for pending results... 7491 1727203978.07667: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'veth0' 7491 1727203978.07768: in run() - task 0affcd87-79f5-0a4a-ad01-000000000adf 7491 1727203978.07782: variable 'ansible_search_path' from source: unknown 7491 1727203978.07786: variable 'ansible_search_path' from source: unknown 7491 1727203978.07829: calling self._execute() 7491 1727203978.07934: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.07938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.07947: variable 'omit' from source: magic vars 7491 1727203978.08338: variable 'ansible_distribution_major_version' from source: facts 7491 1727203978.08350: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203978.08363: variable 'omit' from source: magic vars 7491 1727203978.08401: variable 'omit' from source: magic vars 7491 1727203978.08508: variable 'profile' from source: include params 7491 1727203978.08512: variable 'interface' from source: play vars 7491 1727203978.08582: variable 'interface' from source: play vars 7491 1727203978.08600: variable 'omit' from source: magic vars 7491 1727203978.08640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203978.08677: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203978.08705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203978.08721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203978.08731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 7491 1727203978.08762: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203978.08771: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.08777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.08892: Set connection var ansible_timeout to 10 7491 1727203978.08898: Set connection var ansible_pipelining to False 7491 1727203978.08910: Set connection var ansible_shell_type to sh 7491 1727203978.08918: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203978.08923: Set connection var ansible_shell_executable to /bin/sh 7491 1727203978.08928: Set connection var ansible_connection to ssh 7491 1727203978.08955: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.08958: variable 'ansible_connection' from source: unknown 7491 1727203978.08961: variable 'ansible_module_compression' from source: unknown 7491 1727203978.08963: variable 'ansible_shell_type' from source: unknown 7491 1727203978.08968: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.08970: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.08972: variable 'ansible_pipelining' from source: unknown 7491 1727203978.08975: variable 'ansible_timeout' from source: unknown 7491 1727203978.08982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.09141: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203978.09152: variable 'omit' from source: magic vars 7491 1727203978.09157: starting attempt loop 7491 1727203978.09160: running the handler 7491 1727203978.09284: variable 'lsr_net_profile_exists' 
from source: set_fact 7491 1727203978.09289: Evaluated conditional (lsr_net_profile_exists): True 7491 1727203978.09295: handler run complete 7491 1727203978.09319: attempt loop complete, returning result 7491 1727203978.09322: _execute() done 7491 1727203978.09325: dumping result to json 7491 1727203978.09327: done dumping result, returning 7491 1727203978.09330: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'veth0' [0affcd87-79f5-0a4a-ad01-000000000adf] 7491 1727203978.09341: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000adf 7491 1727203978.09433: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000adf 7491 1727203978.09436: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727203978.09495: no more pending results, returning what we have 7491 1727203978.09500: results queue empty 7491 1727203978.09501: checking for any_errors_fatal 7491 1727203978.09507: done checking for any_errors_fatal 7491 1727203978.09508: checking for max_fail_percentage 7491 1727203978.09510: done checking for max_fail_percentage 7491 1727203978.09511: checking to see if all hosts have failed and the running result is not ok 7491 1727203978.09513: done checking to see if all hosts have failed 7491 1727203978.09514: getting the remaining hosts for this loop 7491 1727203978.09516: done getting the remaining hosts for this loop 7491 1727203978.09522: getting the next task for host managed-node3 7491 1727203978.09530: done getting next task for host managed-node3 7491 1727203978.09533: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 7491 1727203978.09536: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203978.09541: getting variables 7491 1727203978.09543: in VariableManager get_vars() 7491 1727203978.09606: Calling all_inventory to load vars for managed-node3 7491 1727203978.09609: Calling groups_inventory to load vars for managed-node3 7491 1727203978.09612: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203978.09625: Calling all_plugins_play to load vars for managed-node3 7491 1727203978.09628: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203978.09632: Calling groups_plugins_play to load vars for managed-node3 7491 1727203978.11468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203978.13171: done with get_vars() 7491 1727203978.13199: done getting variables 7491 1727203978.13269: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203978.13390: variable 'profile' from source: include params 7491 1727203978.13394: variable 'interface' from source: play vars 7491 1727203978.13460: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:52:58 -0400 (0:00:00.064) 
0:00:20.058 ***** 7491 1727203978.13503: entering _queue_task() for managed-node3/assert 7491 1727203978.13825: worker is 1 (out of 1 available) 7491 1727203978.13837: exiting _queue_task() for managed-node3/assert 7491 1727203978.13851: done queuing things up, now waiting for results queue to drain 7491 1727203978.13852: waiting for pending results... 7491 1727203978.14153: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'veth0' 7491 1727203978.14251: in run() - task 0affcd87-79f5-0a4a-ad01-000000000ae0 7491 1727203978.14268: variable 'ansible_search_path' from source: unknown 7491 1727203978.14272: variable 'ansible_search_path' from source: unknown 7491 1727203978.14314: calling self._execute() 7491 1727203978.14416: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.14428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.14437: variable 'omit' from source: magic vars 7491 1727203978.14823: variable 'ansible_distribution_major_version' from source: facts 7491 1727203978.14834: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203978.14847: variable 'omit' from source: magic vars 7491 1727203978.14894: variable 'omit' from source: magic vars 7491 1727203978.15000: variable 'profile' from source: include params 7491 1727203978.15004: variable 'interface' from source: play vars 7491 1727203978.15074: variable 'interface' from source: play vars 7491 1727203978.15094: variable 'omit' from source: magic vars 7491 1727203978.15137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203978.15176: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203978.15203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203978.15222: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203978.15232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203978.15263: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203978.15269: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.15277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.15386: Set connection var ansible_timeout to 10 7491 1727203978.15405: Set connection var ansible_pipelining to False 7491 1727203978.15410: Set connection var ansible_shell_type to sh 7491 1727203978.15413: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203978.15415: Set connection var ansible_shell_executable to /bin/sh 7491 1727203978.15421: Set connection var ansible_connection to ssh 7491 1727203978.15443: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.15446: variable 'ansible_connection' from source: unknown 7491 1727203978.15449: variable 'ansible_module_compression' from source: unknown 7491 1727203978.15451: variable 'ansible_shell_type' from source: unknown 7491 1727203978.15453: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.15455: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.15459: variable 'ansible_pipelining' from source: unknown 7491 1727203978.15462: variable 'ansible_timeout' from source: unknown 7491 1727203978.15467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.15618: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 7491 1727203978.15626: variable 'omit' from source: magic vars 7491 1727203978.15632: starting attempt loop 7491 1727203978.15635: running the handler 7491 1727203978.15750: variable 'lsr_net_profile_ansible_managed' from source: set_fact 7491 1727203978.15753: Evaluated conditional (lsr_net_profile_ansible_managed): True 7491 1727203978.15760: handler run complete 7491 1727203978.15778: attempt loop complete, returning result 7491 1727203978.15781: _execute() done 7491 1727203978.15784: dumping result to json 7491 1727203978.15786: done dumping result, returning 7491 1727203978.15793: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'veth0' [0affcd87-79f5-0a4a-ad01-000000000ae0] 7491 1727203978.15803: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000ae0 7491 1727203978.15897: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000ae0 7491 1727203978.15900: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727203978.15977: no more pending results, returning what we have 7491 1727203978.15981: results queue empty 7491 1727203978.15982: checking for any_errors_fatal 7491 1727203978.15989: done checking for any_errors_fatal 7491 1727203978.15990: checking for max_fail_percentage 7491 1727203978.15992: done checking for max_fail_percentage 7491 1727203978.15993: checking to see if all hosts have failed and the running result is not ok 7491 1727203978.15996: done checking to see if all hosts have failed 7491 1727203978.15996: getting the remaining hosts for this loop 7491 1727203978.15999: done getting the remaining hosts for this loop 7491 1727203978.16003: getting the next task for host managed-node3 7491 1727203978.16009: done getting next task for host managed-node3 7491 1727203978.16012: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 7491 1727203978.16015: ^ 
state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203978.16019: getting variables 7491 1727203978.16021: in VariableManager get_vars() 7491 1727203978.16078: Calling all_inventory to load vars for managed-node3 7491 1727203978.16082: Calling groups_inventory to load vars for managed-node3 7491 1727203978.16084: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203978.16097: Calling all_plugins_play to load vars for managed-node3 7491 1727203978.16101: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203978.16104: Calling groups_plugins_play to load vars for managed-node3 7491 1727203978.17739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203978.19618: done with get_vars() 7491 1727203978.19647: done getting variables 7491 1727203978.19710: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203978.19835: variable 'profile' from source: include params 7491 1727203978.19840: variable 'interface' from source: play vars 7491 1727203978.19906: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:52:58 -0400 (0:00:00.064) 0:00:20.123 ***** 7491 1727203978.19944: entering _queue_task() for managed-node3/assert 7491 1727203978.20260: worker is 1 (out of 1 available) 7491 1727203978.20274: exiting _queue_task() for managed-node3/assert 7491 1727203978.20292: done queuing things up, now waiting for results queue to drain 7491 1727203978.20294: waiting for pending results... 7491 1727203978.20590: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in veth0 7491 1727203978.20680: in run() - task 0affcd87-79f5-0a4a-ad01-000000000ae1 7491 1727203978.20694: variable 'ansible_search_path' from source: unknown 7491 1727203978.20699: variable 'ansible_search_path' from source: unknown 7491 1727203978.20746: calling self._execute() 7491 1727203978.20867: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.20871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.20879: variable 'omit' from source: magic vars 7491 1727203978.21288: variable 'ansible_distribution_major_version' from source: facts 7491 1727203978.21299: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203978.21307: variable 'omit' from source: magic vars 7491 1727203978.21344: variable 'omit' from source: magic vars 7491 1727203978.21453: variable 'profile' from source: include params 7491 1727203978.21456: variable 'interface' from source: play vars 7491 1727203978.21526: variable 'interface' from source: play vars 7491 1727203978.21545: variable 'omit' from source: magic vars 7491 1727203978.21591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203978.21636: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 7491 1727203978.21661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203978.21680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203978.21691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203978.21732: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203978.21735: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.21738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.21851: Set connection var ansible_timeout to 10 7491 1727203978.21858: Set connection var ansible_pipelining to False 7491 1727203978.21865: Set connection var ansible_shell_type to sh 7491 1727203978.21872: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203978.21881: Set connection var ansible_shell_executable to /bin/sh 7491 1727203978.21886: Set connection var ansible_connection to ssh 7491 1727203978.21909: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.21912: variable 'ansible_connection' from source: unknown 7491 1727203978.21914: variable 'ansible_module_compression' from source: unknown 7491 1727203978.21919: variable 'ansible_shell_type' from source: unknown 7491 1727203978.21921: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.21925: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.21931: variable 'ansible_pipelining' from source: unknown 7491 1727203978.21940: variable 'ansible_timeout' from source: unknown 7491 1727203978.21944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.22100: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203978.22111: variable 'omit' from source: magic vars 7491 1727203978.22118: starting attempt loop 7491 1727203978.22122: running the handler 7491 1727203978.22231: variable 'lsr_net_profile_fingerprint' from source: set_fact 7491 1727203978.22234: Evaluated conditional (lsr_net_profile_fingerprint): True 7491 1727203978.22243: handler run complete 7491 1727203978.22268: attempt loop complete, returning result 7491 1727203978.22271: _execute() done 7491 1727203978.22274: dumping result to json 7491 1727203978.22278: done dumping result, returning 7491 1727203978.22286: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in veth0 [0affcd87-79f5-0a4a-ad01-000000000ae1] 7491 1727203978.22292: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000ae1 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727203978.22431: no more pending results, returning what we have 7491 1727203978.22435: results queue empty 7491 1727203978.22437: checking for any_errors_fatal 7491 1727203978.22444: done checking for any_errors_fatal 7491 1727203978.22445: checking for max_fail_percentage 7491 1727203978.22447: done checking for max_fail_percentage 7491 1727203978.22448: checking to see if all hosts have failed and the running result is not ok 7491 1727203978.22449: done checking to see if all hosts have failed 7491 1727203978.22450: getting the remaining hosts for this loop 7491 1727203978.22452: done getting the remaining hosts for this loop 7491 1727203978.22456: getting the next task for host managed-node3 7491 1727203978.22467: done getting next task for host managed-node3 7491 1727203978.22470: ^ task is: TASK: Show ipv4 routes 
7491 1727203978.22473: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203978.22477: getting variables 7491 1727203978.22480: in VariableManager get_vars() 7491 1727203978.22539: Calling all_inventory to load vars for managed-node3 7491 1727203978.22543: Calling groups_inventory to load vars for managed-node3 7491 1727203978.22545: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203978.22560: Calling all_plugins_play to load vars for managed-node3 7491 1727203978.22563: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203978.22569: Calling groups_plugins_play to load vars for managed-node3 7491 1727203978.28199: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000ae1 7491 1727203978.28204: WORKER PROCESS EXITING 7491 1727203978.29181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203978.30901: done with get_vars() 7491 1727203978.30933: done getting variables 7491 1727203978.30990: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ipv4 routes] ******************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:48 Tuesday 24 September 2024 14:52:58 -0400 (0:00:00.110) 0:00:20.234 ***** 7491 1727203978.31027: entering _queue_task() for managed-node3/command 7491 1727203978.31370: worker is 1 (out of 1 available) 
7491 1727203978.31383: exiting _queue_task() for managed-node3/command 7491 1727203978.31395: done queuing things up, now waiting for results queue to drain 7491 1727203978.31397: waiting for pending results... 7491 1727203978.31708: running TaskExecutor() for managed-node3/TASK: Show ipv4 routes 7491 1727203978.31822: in run() - task 0affcd87-79f5-0a4a-ad01-00000000005d 7491 1727203978.31849: variable 'ansible_search_path' from source: unknown 7491 1727203978.31899: calling self._execute() 7491 1727203978.32027: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.32041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.32062: variable 'omit' from source: magic vars 7491 1727203978.32484: variable 'ansible_distribution_major_version' from source: facts 7491 1727203978.32505: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203978.32522: variable 'omit' from source: magic vars 7491 1727203978.32551: variable 'omit' from source: magic vars 7491 1727203978.32596: variable 'omit' from source: magic vars 7491 1727203978.32648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203978.32692: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203978.32727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203978.32750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203978.32770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203978.32806: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203978.32819: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 
1727203978.32830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.32936: Set connection var ansible_timeout to 10 7491 1727203978.32948: Set connection var ansible_pipelining to False 7491 1727203978.32957: Set connection var ansible_shell_type to sh 7491 1727203978.32970: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203978.32983: Set connection var ansible_shell_executable to /bin/sh 7491 1727203978.32993: Set connection var ansible_connection to ssh 7491 1727203978.33024: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.33032: variable 'ansible_connection' from source: unknown 7491 1727203978.33043: variable 'ansible_module_compression' from source: unknown 7491 1727203978.33051: variable 'ansible_shell_type' from source: unknown 7491 1727203978.33058: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.33066: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.33075: variable 'ansible_pipelining' from source: unknown 7491 1727203978.33082: variable 'ansible_timeout' from source: unknown 7491 1727203978.33089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.33263: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203978.33285: variable 'omit' from source: magic vars 7491 1727203978.33295: starting attempt loop 7491 1727203978.33301: running the handler 7491 1727203978.33326: _low_level_execute_command(): starting 7491 1727203978.33340: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203978.34202: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 
1727203978.34222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.34241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.34262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.34307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203978.34322: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203978.34337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.34359: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203978.34374: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203978.34385: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203978.34398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.34411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.34430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.34443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203978.34455: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203978.34472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.34547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203978.34567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203978.34584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.34667: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203978.36296: stdout chunk (state=3): >>>/root <<< 7491 1727203978.36474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203978.36479: stdout chunk (state=3): >>><<< 7491 1727203978.36491: stderr chunk (state=3): >>><<< 7491 1727203978.36522: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203978.36532: _low_level_execute_command(): starting 7491 1727203978.36540: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203978.365179-8634-188934979070924 `" && echo ansible-tmp-1727203978.365179-8634-188934979070924="` echo /root/.ansible/tmp/ansible-tmp-1727203978.365179-8634-188934979070924 `" ) && sleep 0' 7491 
1727203978.37200: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203978.37209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.37222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.37234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.37275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203978.37282: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203978.37293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.37305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203978.37312: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203978.37321: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203978.37326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.37335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.37346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.37355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203978.37360: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203978.37370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.37455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203978.37474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203978.37485: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.37555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203978.39353: stdout chunk (state=3): >>>ansible-tmp-1727203978.365179-8634-188934979070924=/root/.ansible/tmp/ansible-tmp-1727203978.365179-8634-188934979070924 <<< 7491 1727203978.39476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203978.39579: stderr chunk (state=3): >>><<< 7491 1727203978.39603: stdout chunk (state=3): >>><<< 7491 1727203978.39678: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203978.365179-8634-188934979070924=/root/.ansible/tmp/ansible-tmp-1727203978.365179-8634-188934979070924 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203978.39682: variable 'ansible_module_compression' from source: unknown 7491 1727203978.39874: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203978.39877: variable 'ansible_facts' from source: unknown 7491 1727203978.39880: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203978.365179-8634-188934979070924/AnsiballZ_command.py 7491 1727203978.40046: Sending initial data 7491 1727203978.40050: Sent initial data (153 bytes) 7491 1727203978.41221: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203978.41250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.41269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.41295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.41375: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203978.41389: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203978.41406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.41429: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203978.41441: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203978.41452: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203978.41467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.41503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.41530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.41567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 
1727203978.41581: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203978.41596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.41706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203978.41734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203978.41763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.41842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203978.43515: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203978.43544: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203978.43581: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpa1lnzmio /root/.ansible/tmp/ansible-tmp-1727203978.365179-8634-188934979070924/AnsiballZ_command.py <<< 7491 1727203978.43609: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203978.44480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203978.44606: stderr chunk (state=3): >>><<< 7491 1727203978.44609: stdout chunk (state=3): >>><<< 7491 1727203978.44635: done transferring module to remote 
7491 1727203978.44647: _low_level_execute_command(): starting 7491 1727203978.44657: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203978.365179-8634-188934979070924/ /root/.ansible/tmp/ansible-tmp-1727203978.365179-8634-188934979070924/AnsiballZ_command.py && sleep 0' 7491 1727203978.45408: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203978.45431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.45447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.45462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.45496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203978.45503: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203978.45511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.45521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203978.45529: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203978.45534: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.45550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.45553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.45570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.45614: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 7491 1727203978.45635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.45682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203978.47343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203978.47430: stderr chunk (state=3): >>><<< 7491 1727203978.47433: stdout chunk (state=3): >>><<< 7491 1727203978.47436: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203978.47441: _low_level_execute_command(): starting 7491 1727203978.47447: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203978.365179-8634-188934979070924/AnsiballZ_command.py && sleep 0' 7491 1727203978.48159: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.48163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.48214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203978.48228: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203978.48242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.48266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203978.48278: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203978.48288: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203978.48303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.48318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.48334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.48346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203978.48362: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203978.48378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.48456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203978.48485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203978.48503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.48594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 7491 1727203978.61845: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \ndefault via 203.0.113.1 dev veth0 proto static metric 65535 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-24 14:52:58.614457", "end": "2024-09-24 14:52:58.617630", "delta": "0:00:00.003173", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203978.62914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203978.62976: stderr chunk (state=3): >>><<< 7491 1727203978.62979: stdout chunk (state=3): >>><<< 7491 1727203978.62995: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \ndefault via 203.0.113.1 dev veth0 proto static metric 65535 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-24 14:52:58.614457", "end": "2024-09-24 14:52:58.617630", "delta": "0:00:00.003173", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
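The task result above shows how the `command` module reports back: the remote `AnsiballZ_command.py` wrapper prints a single JSON object on stdout, with the raw `ip route` output in its `"stdout"` field (one route per line) alongside `rc`, `cmd`, and timing fields. A minimal sketch of pulling the default routes out of such a payload — the `payload` literal below is a hypothetical sample mirroring the result captured in the log, not the log itself:

```python
import json

# Hypothetical sample mirroring the command-module result seen in the log:
# a JSON object whose "stdout" field holds raw `ip route` output.
payload = json.loads(
    '{"changed": true, "rc": 0, "cmd": ["ip", "route"], '
    '"stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \\n'
    'default via 203.0.113.1 dev veth0 proto static metric 65535 \\n'
    '10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \\n'
    '203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 "}'
)

# Keep only default routes, recording each (device, metric) pair.
defaults = []
for line in payload["stdout"].splitlines():
    fields = line.split()
    if fields and fields[0] == "default":
        dev = fields[fields.index("dev") + 1]
        metric = int(fields[fields.index("metric") + 1])
        defaults.append((dev, metric))

print(defaults)  # -> [('eth0', 100), ('veth0', 65535)]
```

The two default routes with different metrics are exactly what the `tests_auto_gateway.yml` play is exercising: the `eth0` DHCP route (metric 100) wins over the static `veth0` route pinned at metric 65535.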
7491 1727203978.63029: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203978.365179-8634-188934979070924/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203978.63039: _low_level_execute_command(): starting 7491 1727203978.63043: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203978.365179-8634-188934979070924/ > /dev/null 2>&1 && sleep 0' 7491 1727203978.63514: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.63522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.63555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.63570: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.63625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203978.63636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203978.63644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.63699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203978.65406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203978.65461: stderr chunk (state=3): >>><<< 7491 1727203978.65466: stdout chunk (state=3): >>><<< 7491 1727203978.65480: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203978.65486: 
handler run complete 7491 1727203978.65505: Evaluated conditional (False): False 7491 1727203978.65520: attempt loop complete, returning result 7491 1727203978.65523: _execute() done 7491 1727203978.65526: dumping result to json 7491 1727203978.65528: done dumping result, returning 7491 1727203978.65536: done running TaskExecutor() for managed-node3/TASK: Show ipv4 routes [0affcd87-79f5-0a4a-ad01-00000000005d] 7491 1727203978.65538: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000005d 7491 1727203978.65642: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000005d 7491 1727203978.65644: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "route" ], "delta": "0:00:00.003173", "end": "2024-09-24 14:52:58.617630", "rc": 0, "start": "2024-09-24 14:52:58.614457" } STDOUT: default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 default via 203.0.113.1 dev veth0 proto static metric 65535 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 7491 1727203978.65715: no more pending results, returning what we have 7491 1727203978.65721: results queue empty 7491 1727203978.65722: checking for any_errors_fatal 7491 1727203978.65730: done checking for any_errors_fatal 7491 1727203978.65731: checking for max_fail_percentage 7491 1727203978.65732: done checking for max_fail_percentage 7491 1727203978.65733: checking to see if all hosts have failed and the running result is not ok 7491 1727203978.65735: done checking to see if all hosts have failed 7491 1727203978.65735: getting the remaining hosts for this loop 7491 1727203978.65737: done getting the remaining hosts for this loop 7491 1727203978.65741: getting the next task for host managed-node3 7491 1727203978.65745: done getting next task for host managed-node3 7491 1727203978.65748: ^ task is: TASK: Assert default ipv4 route is present 7491 1727203978.65750: ^ 
state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203978.65753: getting variables 7491 1727203978.65755: in VariableManager get_vars() 7491 1727203978.65804: Calling all_inventory to load vars for managed-node3 7491 1727203978.65806: Calling groups_inventory to load vars for managed-node3 7491 1727203978.65808: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203978.65821: Calling all_plugins_play to load vars for managed-node3 7491 1727203978.65823: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203978.65826: Calling groups_plugins_play to load vars for managed-node3 7491 1727203978.66721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203978.67658: done with get_vars() 7491 1727203978.67678: done getting variables 7491 1727203978.67727: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv4 route is present] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:52 Tuesday 24 September 2024 14:52:58 -0400 (0:00:00.367) 0:00:20.601 ***** 7491 1727203978.67749: entering _queue_task() for managed-node3/assert 7491 1727203978.67980: worker is 1 (out of 1 available) 7491 1727203978.67994: exiting _queue_task() for managed-node3/assert 7491 1727203978.68009: done queuing things up, now waiting for results queue to drain 7491 
1727203978.68010: waiting for pending results... 7491 1727203978.68191: running TaskExecutor() for managed-node3/TASK: Assert default ipv4 route is present 7491 1727203978.68259: in run() - task 0affcd87-79f5-0a4a-ad01-00000000005e 7491 1727203978.68273: variable 'ansible_search_path' from source: unknown 7491 1727203978.68302: calling self._execute() 7491 1727203978.68386: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.68390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.68400: variable 'omit' from source: magic vars 7491 1727203978.68682: variable 'ansible_distribution_major_version' from source: facts 7491 1727203978.68693: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203978.68700: variable 'omit' from source: magic vars 7491 1727203978.68715: variable 'omit' from source: magic vars 7491 1727203978.68742: variable 'omit' from source: magic vars 7491 1727203978.68779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203978.68809: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203978.68829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203978.68842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203978.68851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203978.68877: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203978.68880: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.68883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.68954: Set connection var 
ansible_timeout to 10 7491 1727203978.68958: Set connection var ansible_pipelining to False 7491 1727203978.68965: Set connection var ansible_shell_type to sh 7491 1727203978.68970: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203978.68976: Set connection var ansible_shell_executable to /bin/sh 7491 1727203978.68981: Set connection var ansible_connection to ssh 7491 1727203978.68999: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.69001: variable 'ansible_connection' from source: unknown 7491 1727203978.69004: variable 'ansible_module_compression' from source: unknown 7491 1727203978.69007: variable 'ansible_shell_type' from source: unknown 7491 1727203978.69009: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.69013: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.69015: variable 'ansible_pipelining' from source: unknown 7491 1727203978.69020: variable 'ansible_timeout' from source: unknown 7491 1727203978.69022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.69123: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203978.69132: variable 'omit' from source: magic vars 7491 1727203978.69135: starting attempt loop 7491 1727203978.69138: running the handler 7491 1727203978.69244: variable '__test_str' from source: task vars 7491 1727203978.69297: variable 'interface' from source: play vars 7491 1727203978.69304: variable 'ipv4_routes' from source: set_fact 7491 1727203978.69314: Evaluated conditional (__test_str in ipv4_routes.stdout): True 7491 1727203978.69321: handler run complete 7491 1727203978.69331: attempt loop complete, returning result 7491 
1727203978.69334: _execute() done 7491 1727203978.69336: dumping result to json 7491 1727203978.69338: done dumping result, returning 7491 1727203978.69344: done running TaskExecutor() for managed-node3/TASK: Assert default ipv4 route is present [0affcd87-79f5-0a4a-ad01-00000000005e] 7491 1727203978.69350: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000005e 7491 1727203978.69437: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000005e 7491 1727203978.69440: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727203978.69515: no more pending results, returning what we have 7491 1727203978.69521: results queue empty 7491 1727203978.69522: checking for any_errors_fatal 7491 1727203978.69532: done checking for any_errors_fatal 7491 1727203978.69532: checking for max_fail_percentage 7491 1727203978.69534: done checking for max_fail_percentage 7491 1727203978.69535: checking to see if all hosts have failed and the running result is not ok 7491 1727203978.69536: done checking to see if all hosts have failed 7491 1727203978.69537: getting the remaining hosts for this loop 7491 1727203978.69539: done getting the remaining hosts for this loop 7491 1727203978.69542: getting the next task for host managed-node3 7491 1727203978.69547: done getting next task for host managed-node3 7491 1727203978.69549: ^ task is: TASK: Get ipv6 routes 7491 1727203978.69551: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203978.69554: getting variables 7491 1727203978.69556: in VariableManager get_vars() 7491 1727203978.69610: Calling all_inventory to load vars for managed-node3 7491 1727203978.69613: Calling groups_inventory to load vars for managed-node3 7491 1727203978.69615: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203978.69626: Calling all_plugins_play to load vars for managed-node3 7491 1727203978.69628: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203978.69631: Calling groups_plugins_play to load vars for managed-node3 7491 1727203978.70442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203978.71379: done with get_vars() 7491 1727203978.71396: done getting variables 7491 1727203978.71443: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:57 Tuesday 24 September 2024 14:52:58 -0400 (0:00:00.037) 0:00:20.638 ***** 7491 1727203978.71466: entering _queue_task() for managed-node3/command 7491 1727203978.71690: worker is 1 (out of 1 available) 7491 1727203978.71704: exiting _queue_task() for managed-node3/command 7491 1727203978.71720: done queuing things up, now waiting for results queue to drain 7491 1727203978.71721: waiting for pending results... 
7491 1727203978.71894: running TaskExecutor() for managed-node3/TASK: Get ipv6 routes 7491 1727203978.71947: in run() - task 0affcd87-79f5-0a4a-ad01-00000000005f 7491 1727203978.71966: variable 'ansible_search_path' from source: unknown 7491 1727203978.71995: calling self._execute() 7491 1727203978.72081: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.72086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.72096: variable 'omit' from source: magic vars 7491 1727203978.72379: variable 'ansible_distribution_major_version' from source: facts 7491 1727203978.72390: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203978.72402: variable 'omit' from source: magic vars 7491 1727203978.72415: variable 'omit' from source: magic vars 7491 1727203978.72441: variable 'omit' from source: magic vars 7491 1727203978.72476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203978.72505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203978.72525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203978.72537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203978.72546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203978.72572: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203978.72575: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.72577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.72650: Set connection var ansible_timeout to 10 7491 1727203978.72654: Set connection var ansible_pipelining 
to False 7491 1727203978.72660: Set connection var ansible_shell_type to sh 7491 1727203978.72666: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203978.72672: Set connection var ansible_shell_executable to /bin/sh 7491 1727203978.72677: Set connection var ansible_connection to ssh 7491 1727203978.72695: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.72698: variable 'ansible_connection' from source: unknown 7491 1727203978.72701: variable 'ansible_module_compression' from source: unknown 7491 1727203978.72703: variable 'ansible_shell_type' from source: unknown 7491 1727203978.72706: variable 'ansible_shell_executable' from source: unknown 7491 1727203978.72710: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203978.72712: variable 'ansible_pipelining' from source: unknown 7491 1727203978.72714: variable 'ansible_timeout' from source: unknown 7491 1727203978.72720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203978.72822: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203978.72834: variable 'omit' from source: magic vars 7491 1727203978.72837: starting attempt loop 7491 1727203978.72841: running the handler 7491 1727203978.72853: _low_level_execute_command(): starting 7491 1727203978.72860: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203978.73394: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.73399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
7491 1727203978.73428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.73431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.73435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.73487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203978.73490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203978.73493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.73545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203978.75093: stdout chunk (state=3): >>>/root <<< 7491 1727203978.75197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203978.75252: stderr chunk (state=3): >>><<< 7491 1727203978.75257: stdout chunk (state=3): >>><<< 7491 1727203978.75280: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203978.75291: _low_level_execute_command(): starting 7491 1727203978.75296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203978.7527983-8649-249824301066460 `" && echo ansible-tmp-1727203978.7527983-8649-249824301066460="` echo /root/.ansible/tmp/ansible-tmp-1727203978.7527983-8649-249824301066460 `" ) && sleep 0' 7491 1727203978.75757: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.75762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.75799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203978.75807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727203978.75821: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.75827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.75840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.75845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.75903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203978.75912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203978.75920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.75973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203978.77756: stdout chunk (state=3): >>>ansible-tmp-1727203978.7527983-8649-249824301066460=/root/.ansible/tmp/ansible-tmp-1727203978.7527983-8649-249824301066460 <<< 7491 1727203978.77866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203978.77927: stderr chunk (state=3): >>><<< 7491 1727203978.77931: stdout chunk (state=3): >>><<< 7491 1727203978.77948: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203978.7527983-8649-249824301066460=/root/.ansible/tmp/ansible-tmp-1727203978.7527983-8649-249824301066460 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203978.77977: variable 'ansible_module_compression' from source: unknown 7491 1727203978.78019: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203978.78054: variable 'ansible_facts' from source: unknown 7491 1727203978.78104: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203978.7527983-8649-249824301066460/AnsiballZ_command.py 7491 1727203978.78210: Sending initial data 7491 1727203978.78213: Sent initial data (154 bytes) 7491 1727203978.78906: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.78912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.78959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.78965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.78969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.79024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203978.79033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.79099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203978.80746: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203978.80787: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203978.80828: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpjt7d9ocq /root/.ansible/tmp/ansible-tmp-1727203978.7527983-8649-249824301066460/AnsiballZ_command.py <<< 7491 1727203978.80868: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203978.81652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203978.81766: 
stderr chunk (state=3): >>><<< 7491 1727203978.81769: stdout chunk (state=3): >>><<< 7491 1727203978.81790: done transferring module to remote 7491 1727203978.81799: _low_level_execute_command(): starting 7491 1727203978.81804: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203978.7527983-8649-249824301066460/ /root/.ansible/tmp/ansible-tmp-1727203978.7527983-8649-249824301066460/AnsiballZ_command.py && sleep 0' 7491 1727203978.82258: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.82265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.82298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.82311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.82370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203978.82386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.82426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203978.84073: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 7491 1727203978.84123: stderr chunk (state=3): >>><<< 7491 1727203978.84126: stdout chunk (state=3): >>><<< 7491 1727203978.84141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203978.84144: _low_level_execute_command(): starting 7491 1727203978.84149: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203978.7527983-8649-249824301066460/AnsiballZ_command.py && sleep 0' 7491 1727203978.84603: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.84615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.84631: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203978.84643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203978.84652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.84704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203978.84707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.84773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203978.98021: stdout chunk (state=3): >>> {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 14:52:58.976038", "end": "2024-09-24 14:52:58.979433", "delta": "0:00:00.003395", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, 
"stdin": null}}} <<< 7491 1727203978.99193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203978.99259: stderr chunk (state=3): >>><<< 7491 1727203978.99263: stdout chunk (state=3): >>><<< 7491 1727203978.99283: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 14:52:58.976038", "end": "2024-09-24 14:52:58.979433", "delta": "0:00:00.003395", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203978.99315: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203978.7527983-8649-249824301066460/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203978.99327: _low_level_execute_command(): starting 7491 1727203978.99332: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203978.7527983-8649-249824301066460/ > /dev/null 2>&1 && sleep 0' 7491 1727203978.99805: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203978.99811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203978.99844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.99860: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203978.99913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203978.99929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203978.99977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203979.01785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203979.01814: stderr chunk (state=3): >>><<< 7491 1727203979.01818: stdout chunk (state=3): >>><<< 7491 1727203979.01842: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203979.01848: handler run complete 7491 1727203979.01877: Evaluated conditional (False): False 7491 1727203979.01890: attempt loop complete, returning result 7491 1727203979.01893: _execute() done 7491 1727203979.01895: dumping result to json 7491 1727203979.01903: done dumping result, returning 7491 1727203979.01912: done running TaskExecutor() for managed-node3/TASK: Get ipv6 routes [0affcd87-79f5-0a4a-ad01-00000000005f] 7491 1727203979.01920: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000005f 7491 1727203979.02038: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000005f 7491 1727203979.02041: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003395", "end": "2024-09-24 14:52:58.979433", "rc": 0, "start": "2024-09-24 14:52:58.976038" } STDOUT: ::1 dev lo proto kernel metric 256 pref medium 2001:db8::/64 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium fe80::/64 dev peerveth0 proto kernel metric 256 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 7491 1727203979.02124: no more pending results, returning what we have 7491 1727203979.02128: results queue empty 7491 1727203979.02129: checking for any_errors_fatal 7491 1727203979.02134: done checking for any_errors_fatal 7491 1727203979.02135: checking for max_fail_percentage 7491 1727203979.02137: done checking for max_fail_percentage 7491 1727203979.02138: checking to see if all hosts have failed and the running result is not ok 7491 1727203979.02139: done checking to see if all hosts have failed 7491 
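[editor's note] The "Get ipv6 routes" task above runs `ip -6 route` via the command module and registers its stdout; the next task ("Assert default ipv6 route is present") then evaluates `__test_str in ipv6_route.stdout`, i.e. a plain substring test against that captured output. A minimal standalone sketch of that check, using the stdout printed in the log; the `__test_str` value here is a hypothetical reconstruction, the real template lives in tests_auto_gateway.yml:

```python
# Captured stdout of the "Get ipv6 routes" task, copied from the log above.
ipv6_route_stdout = """\
::1 dev lo proto kernel metric 256 pref medium
2001:db8::/64 dev veth0 proto kernel metric 101 pref medium
fe80::/64 dev eth0 proto kernel metric 256 pref medium
fe80::/64 dev peerveth0 proto kernel metric 256 pref medium
fe80::/64 dev veth0 proto kernel metric 1024 pref medium
default via 2001:db8::1 dev veth0 proto static metric 101 pref medium"""


def has_route(stdout: str, test_str: str) -> bool:
    """Mirror the assert task's condition: a substring test on stdout."""
    return test_str in stdout


# Hypothetical __test_str; the playbook builds its own from play vars
# such as `interface` (veth0 in this run).
test_str = "default via 2001:db8::1 dev veth0"
print(has_route(ipv6_route_stdout, test_str))  # True, matching the log
```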
1727203979.02140: getting the remaining hosts for this loop 7491 1727203979.02142: done getting the remaining hosts for this loop 7491 1727203979.02146: getting the next task for host managed-node3 7491 1727203979.02151: done getting next task for host managed-node3 7491 1727203979.02153: ^ task is: TASK: Assert default ipv6 route is present 7491 1727203979.02155: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203979.02159: getting variables 7491 1727203979.02160: in VariableManager get_vars() 7491 1727203979.02209: Calling all_inventory to load vars for managed-node3 7491 1727203979.02212: Calling groups_inventory to load vars for managed-node3 7491 1727203979.02218: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203979.02257: Calling all_plugins_play to load vars for managed-node3 7491 1727203979.02261: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203979.02266: Calling groups_plugins_play to load vars for managed-node3 7491 1727203979.03999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203979.04920: done with get_vars() 7491 1727203979.04939: done getting variables 7491 1727203979.04986: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is present] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:61 
Tuesday 24 September 2024 14:52:59 -0400 (0:00:00.335) 0:00:20.973 ***** 7491 1727203979.05008: entering _queue_task() for managed-node3/assert 7491 1727203979.05235: worker is 1 (out of 1 available) 7491 1727203979.05248: exiting _queue_task() for managed-node3/assert 7491 1727203979.05261: done queuing things up, now waiting for results queue to drain 7491 1727203979.05262: waiting for pending results... 7491 1727203979.05451: running TaskExecutor() for managed-node3/TASK: Assert default ipv6 route is present 7491 1727203979.05514: in run() - task 0affcd87-79f5-0a4a-ad01-000000000060 7491 1727203979.05530: variable 'ansible_search_path' from source: unknown 7491 1727203979.05559: calling self._execute() 7491 1727203979.05647: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.05652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.05662: variable 'omit' from source: magic vars 7491 1727203979.06084: variable 'ansible_distribution_major_version' from source: facts 7491 1727203979.06101: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203979.06228: variable 'network_provider' from source: set_fact 7491 1727203979.06239: Evaluated conditional (network_provider == "nm"): True 7491 1727203979.06256: variable 'omit' from source: magic vars 7491 1727203979.06284: variable 'omit' from source: magic vars 7491 1727203979.06731: variable 'omit' from source: magic vars 7491 1727203979.06773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203979.06808: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203979.06833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203979.06848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 7491 1727203979.06860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203979.06891: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203979.06895: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.06897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.07000: Set connection var ansible_timeout to 10 7491 1727203979.07003: Set connection var ansible_pipelining to False 7491 1727203979.07008: Set connection var ansible_shell_type to sh 7491 1727203979.07015: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203979.07025: Set connection var ansible_shell_executable to /bin/sh 7491 1727203979.07030: Set connection var ansible_connection to ssh 7491 1727203979.07055: variable 'ansible_shell_executable' from source: unknown 7491 1727203979.07058: variable 'ansible_connection' from source: unknown 7491 1727203979.07061: variable 'ansible_module_compression' from source: unknown 7491 1727203979.07063: variable 'ansible_shell_type' from source: unknown 7491 1727203979.07067: variable 'ansible_shell_executable' from source: unknown 7491 1727203979.07070: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.07082: variable 'ansible_pipelining' from source: unknown 7491 1727203979.07085: variable 'ansible_timeout' from source: unknown 7491 1727203979.07087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.07216: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203979.07230: variable 'omit' from 
source: magic vars 7491 1727203979.07233: starting attempt loop 7491 1727203979.07236: running the handler 7491 1727203979.07381: variable '__test_str' from source: task vars 7491 1727203979.07451: variable 'interface' from source: play vars 7491 1727203979.07459: variable 'ipv6_route' from source: set_fact 7491 1727203979.07473: Evaluated conditional (__test_str in ipv6_route.stdout): True 7491 1727203979.07479: handler run complete 7491 1727203979.07494: attempt loop complete, returning result 7491 1727203979.07497: _execute() done 7491 1727203979.07499: dumping result to json 7491 1727203979.07502: done dumping result, returning 7491 1727203979.07509: done running TaskExecutor() for managed-node3/TASK: Assert default ipv6 route is present [0affcd87-79f5-0a4a-ad01-000000000060] 7491 1727203979.07515: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000060 7491 1727203979.07612: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000060 7491 1727203979.07615: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727203979.07660: no more pending results, returning what we have 7491 1727203979.07665: results queue empty 7491 1727203979.07666: checking for any_errors_fatal 7491 1727203979.07678: done checking for any_errors_fatal 7491 1727203979.07679: checking for max_fail_percentage 7491 1727203979.07681: done checking for max_fail_percentage 7491 1727203979.07682: checking to see if all hosts have failed and the running result is not ok 7491 1727203979.07683: done checking to see if all hosts have failed 7491 1727203979.07684: getting the remaining hosts for this loop 7491 1727203979.07685: done getting the remaining hosts for this loop 7491 1727203979.07689: getting the next task for host managed-node3 7491 1727203979.07695: done getting next task for host managed-node3 7491 1727203979.07698: ^ task is: TASK: TEARDOWN: remove profiles. 
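[editor's note] The substring assertion passes here, but the route table captured above has a regular `ip -6 route` field layout (destination, then key/value pairs such as `via`, `dev`, `proto`, `metric`, `pref`) that can also be parsed structurally. A hedged sketch of such a parser, illustrative only and not part of the role or test suite:

```python
def parse_route(line: str):
    """Parse one `ip -6 route` line into a dict.

    The first token is the destination; the rest of the line is
    alternating key/value pairs (e.g. "via 2001:db8::1 dev veth0 ...").
    """
    tokens = line.split()
    route = {"dst": tokens[0]}
    i = 1
    while i + 1 < len(tokens):
        route[tokens[i]] = tokens[i + 1]
        i += 2
    return route


# The default-route line printed by the "Get ipv6 routes" task above.
route = parse_route(
    "default via 2001:db8::1 dev veth0 proto static metric 101 pref medium"
)
print(route["via"], route["dev"], route["metric"])  # 2001:db8::1 veth0 101
```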
7491 1727203979.07700: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203979.07702: getting variables 7491 1727203979.07704: in VariableManager get_vars() 7491 1727203979.07752: Calling all_inventory to load vars for managed-node3 7491 1727203979.07755: Calling groups_inventory to load vars for managed-node3 7491 1727203979.07757: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203979.07769: Calling all_plugins_play to load vars for managed-node3 7491 1727203979.07771: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203979.07774: Calling groups_plugins_play to load vars for managed-node3 7491 1727203979.09165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203979.10892: done with get_vars() 7491 1727203979.10926: done getting variables 7491 1727203979.11001: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] 
********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:67 Tuesday 24 September 2024 14:52:59 -0400 (0:00:00.060) 0:00:21.034 ***** 7491 1727203979.11034: entering _queue_task() for managed-node3/debug 7491 1727203979.11377: worker is 1 (out of 1 available) 7491 1727203979.11392: exiting _queue_task() for managed-node3/debug 7491 1727203979.11405: done queuing things up, now waiting for results queue to drain 7491 1727203979.11407: waiting for pending results... 7491 1727203979.11727: running TaskExecutor() for managed-node3/TASK: TEARDOWN: remove profiles. 7491 1727203979.11834: in run() - task 0affcd87-79f5-0a4a-ad01-000000000061 7491 1727203979.11863: variable 'ansible_search_path' from source: unknown 7491 1727203979.11909: calling self._execute() 7491 1727203979.12043: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.12055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.12078: variable 'omit' from source: magic vars 7491 1727203979.12483: variable 'ansible_distribution_major_version' from source: facts 7491 1727203979.12503: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203979.12524: variable 'omit' from source: magic vars 7491 1727203979.12550: variable 'omit' from source: magic vars 7491 1727203979.12593: variable 'omit' from source: magic vars 7491 1727203979.12648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203979.12691: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203979.12719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203979.12748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 7491 1727203979.12767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203979.12805: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203979.12813: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.12822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.12934: Set connection var ansible_timeout to 10 7491 1727203979.12955: Set connection var ansible_pipelining to False 7491 1727203979.12968: Set connection var ansible_shell_type to sh 7491 1727203979.12979: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203979.12991: Set connection var ansible_shell_executable to /bin/sh 7491 1727203979.13002: Set connection var ansible_connection to ssh 7491 1727203979.13029: variable 'ansible_shell_executable' from source: unknown 7491 1727203979.13038: variable 'ansible_connection' from source: unknown 7491 1727203979.13046: variable 'ansible_module_compression' from source: unknown 7491 1727203979.13061: variable 'ansible_shell_type' from source: unknown 7491 1727203979.13072: variable 'ansible_shell_executable' from source: unknown 7491 1727203979.13080: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.13088: variable 'ansible_pipelining' from source: unknown 7491 1727203979.13096: variable 'ansible_timeout' from source: unknown 7491 1727203979.13104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.13263: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203979.13285: 
variable 'omit' from source: magic vars 7491 1727203979.13293: starting attempt loop 7491 1727203979.13299: running the handler 7491 1727203979.13343: handler run complete 7491 1727203979.13365: attempt loop complete, returning result 7491 1727203979.13372: _execute() done 7491 1727203979.13383: dumping result to json 7491 1727203979.13390: done dumping result, returning 7491 1727203979.13398: done running TaskExecutor() for managed-node3/TASK: TEARDOWN: remove profiles. [0affcd87-79f5-0a4a-ad01-000000000061] 7491 1727203979.13406: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000061 ok: [managed-node3] => {} MSG: ################################################## 7491 1727203979.13542: no more pending results, returning what we have 7491 1727203979.13546: results queue empty 7491 1727203979.13547: checking for any_errors_fatal 7491 1727203979.13553: done checking for any_errors_fatal 7491 1727203979.13553: checking for max_fail_percentage 7491 1727203979.13555: done checking for max_fail_percentage 7491 1727203979.13556: checking to see if all hosts have failed and the running result is not ok 7491 1727203979.13557: done checking to see if all hosts have failed 7491 1727203979.13558: getting the remaining hosts for this loop 7491 1727203979.13560: done getting the remaining hosts for this loop 7491 1727203979.13565: getting the next task for host managed-node3 7491 1727203979.13574: done getting next task for host managed-node3 7491 1727203979.13580: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7491 1727203979.13584: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203979.13606: getting variables 7491 1727203979.13608: in VariableManager get_vars() 7491 1727203979.13661: Calling all_inventory to load vars for managed-node3 7491 1727203979.13665: Calling groups_inventory to load vars for managed-node3 7491 1727203979.13668: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203979.13679: Calling all_plugins_play to load vars for managed-node3 7491 1727203979.13682: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203979.13685: Calling groups_plugins_play to load vars for managed-node3 7491 1727203979.14738: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000061 7491 1727203979.14743: WORKER PROCESS EXITING 7491 1727203979.15638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203979.17347: done with get_vars() 7491 1727203979.17378: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:52:59 -0400 (0:00:00.064) 0:00:21.098 ***** 7491 1727203979.17484: entering _queue_task() for managed-node3/include_tasks 7491 1727203979.17798: worker is 1 (out of 1 available) 7491 1727203979.17812: exiting _queue_task() for managed-node3/include_tasks 7491 1727203979.17825: done queuing things up, now waiting for results queue to drain 7491 1727203979.17827: waiting for pending results... 
7491 1727203979.18131: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7491 1727203979.18305: in run() - task 0affcd87-79f5-0a4a-ad01-000000000069 7491 1727203979.18334: variable 'ansible_search_path' from source: unknown 7491 1727203979.18341: variable 'ansible_search_path' from source: unknown 7491 1727203979.18391: calling self._execute() 7491 1727203979.18514: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.18530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.18549: variable 'omit' from source: magic vars 7491 1727203979.18979: variable 'ansible_distribution_major_version' from source: facts 7491 1727203979.19001: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203979.19012: _execute() done 7491 1727203979.19020: dumping result to json 7491 1727203979.19031: done dumping result, returning 7491 1727203979.19040: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-0a4a-ad01-000000000069] 7491 1727203979.19051: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000069 7491 1727203979.19203: no more pending results, returning what we have 7491 1727203979.19209: in VariableManager get_vars() 7491 1727203979.19274: Calling all_inventory to load vars for managed-node3 7491 1727203979.19278: Calling groups_inventory to load vars for managed-node3 7491 1727203979.19280: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203979.19294: Calling all_plugins_play to load vars for managed-node3 7491 1727203979.19297: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203979.19300: Calling groups_plugins_play to load vars for managed-node3 7491 1727203979.20313: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000069 7491 1727203979.20317: WORKER 
PROCESS EXITING 7491 1727203979.21028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203979.22962: done with get_vars() 7491 1727203979.22984: variable 'ansible_search_path' from source: unknown 7491 1727203979.22986: variable 'ansible_search_path' from source: unknown 7491 1727203979.23030: we have included files to process 7491 1727203979.23031: generating all_blocks data 7491 1727203979.23033: done generating all_blocks data 7491 1727203979.23043: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7491 1727203979.23045: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7491 1727203979.23047: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7491 1727203979.23636: done processing included file 7491 1727203979.23638: iterating over new_blocks loaded from include file 7491 1727203979.23640: in VariableManager get_vars() 7491 1727203979.23674: done with get_vars() 7491 1727203979.23676: filtering new block on tags 7491 1727203979.23698: done filtering new block on tags 7491 1727203979.23701: in VariableManager get_vars() 7491 1727203979.23729: done with get_vars() 7491 1727203979.23731: filtering new block on tags 7491 1727203979.23750: done filtering new block on tags 7491 1727203979.23753: in VariableManager get_vars() 7491 1727203979.23784: done with get_vars() 7491 1727203979.23786: filtering new block on tags 7491 1727203979.23806: done filtering new block on tags 7491 1727203979.23808: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 7491 1727203979.23813: extending task lists for all hosts with included blocks 7491 1727203979.24632: done 
extending task lists 7491 1727203979.24634: done processing included files 7491 1727203979.24635: results queue empty 7491 1727203979.24636: checking for any_errors_fatal 7491 1727203979.24640: done checking for any_errors_fatal 7491 1727203979.24641: checking for max_fail_percentage 7491 1727203979.24642: done checking for max_fail_percentage 7491 1727203979.24643: checking to see if all hosts have failed and the running result is not ok 7491 1727203979.24644: done checking to see if all hosts have failed 7491 1727203979.24645: getting the remaining hosts for this loop 7491 1727203979.24646: done getting the remaining hosts for this loop 7491 1727203979.24649: getting the next task for host managed-node3 7491 1727203979.24653: done getting next task for host managed-node3 7491 1727203979.24656: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7491 1727203979.24659: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203979.24671: getting variables 7491 1727203979.24672: in VariableManager get_vars() 7491 1727203979.24697: Calling all_inventory to load vars for managed-node3 7491 1727203979.24700: Calling groups_inventory to load vars for managed-node3 7491 1727203979.24702: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203979.24708: Calling all_plugins_play to load vars for managed-node3 7491 1727203979.24710: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203979.24714: Calling groups_plugins_play to load vars for managed-node3 7491 1727203979.26029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203979.27733: done with get_vars() 7491 1727203979.27759: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:52:59 -0400 (0:00:00.103) 0:00:21.202 ***** 7491 1727203979.27852: entering _queue_task() for managed-node3/setup 7491 1727203979.28188: worker is 1 (out of 1 available) 7491 1727203979.28200: exiting _queue_task() for managed-node3/setup 7491 1727203979.28212: done queuing things up, now waiting for results queue to drain 7491 1727203979.28214: waiting for pending results... 
7491 1727203979.28515: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7491 1727203979.28701: in run() - task 0affcd87-79f5-0a4a-ad01-000000000d46 7491 1727203979.28724: variable 'ansible_search_path' from source: unknown 7491 1727203979.28732: variable 'ansible_search_path' from source: unknown 7491 1727203979.28778: calling self._execute() 7491 1727203979.28887: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.28899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.28920: variable 'omit' from source: magic vars 7491 1727203979.29312: variable 'ansible_distribution_major_version' from source: facts 7491 1727203979.29329: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203979.29565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203979.32142: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203979.32226: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203979.32273: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203979.32315: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203979.32351: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203979.32435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203979.32477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203979.32511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203979.32556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203979.32580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203979.32639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203979.32668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203979.32698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203979.32745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203979.32768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203979.32946: variable '__network_required_facts' from source: role '' defaults 
7491 1727203979.32959: variable 'ansible_facts' from source: unknown 7491 1727203979.33710: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7491 1727203979.33719: when evaluation is False, skipping this task 7491 1727203979.33726: _execute() done 7491 1727203979.33734: dumping result to json 7491 1727203979.33741: done dumping result, returning 7491 1727203979.33755: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-0a4a-ad01-000000000d46] 7491 1727203979.33771: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000d46 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203979.33932: no more pending results, returning what we have 7491 1727203979.33937: results queue empty 7491 1727203979.33938: checking for any_errors_fatal 7491 1727203979.33940: done checking for any_errors_fatal 7491 1727203979.33941: checking for max_fail_percentage 7491 1727203979.33943: done checking for max_fail_percentage 7491 1727203979.33944: checking to see if all hosts have failed and the running result is not ok 7491 1727203979.33945: done checking to see if all hosts have failed 7491 1727203979.33946: getting the remaining hosts for this loop 7491 1727203979.33948: done getting the remaining hosts for this loop 7491 1727203979.33953: getting the next task for host managed-node3 7491 1727203979.33965: done getting next task for host managed-node3 7491 1727203979.33970: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7491 1727203979.33974: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203979.33993: getting variables 7491 1727203979.33997: in VariableManager get_vars() 7491 1727203979.34058: Calling all_inventory to load vars for managed-node3 7491 1727203979.34061: Calling groups_inventory to load vars for managed-node3 7491 1727203979.34066: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203979.34215: Calling all_plugins_play to load vars for managed-node3 7491 1727203979.34220: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203979.34225: Calling groups_plugins_play to load vars for managed-node3 7491 1727203979.35392: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000d46 7491 1727203979.35396: WORKER PROCESS EXITING 7491 1727203979.36484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203979.38200: done with get_vars() 7491 1727203979.38228: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:52:59 -0400 (0:00:00.104) 0:00:21.307 ***** 7491 1727203979.38344: entering _queue_task() for managed-node3/stat 7491 1727203979.38668: worker is 1 (out of 1 
available) 7491 1727203979.38682: exiting _queue_task() for managed-node3/stat 7491 1727203979.38695: done queuing things up, now waiting for results queue to drain 7491 1727203979.38697: waiting for pending results... 7491 1727203979.38993: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7491 1727203979.39177: in run() - task 0affcd87-79f5-0a4a-ad01-000000000d48 7491 1727203979.39200: variable 'ansible_search_path' from source: unknown 7491 1727203979.39209: variable 'ansible_search_path' from source: unknown 7491 1727203979.39252: calling self._execute() 7491 1727203979.39354: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.39372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.39392: variable 'omit' from source: magic vars 7491 1727203979.39779: variable 'ansible_distribution_major_version' from source: facts 7491 1727203979.39800: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203979.39982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203979.40275: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203979.40324: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203979.40375: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203979.40416: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203979.40510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203979.40541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203979.40582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203979.40614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203979.40710: variable '__network_is_ostree' from source: set_fact 7491 1727203979.40722: Evaluated conditional (not __network_is_ostree is defined): False 7491 1727203979.40730: when evaluation is False, skipping this task 7491 1727203979.40737: _execute() done 7491 1727203979.40743: dumping result to json 7491 1727203979.40751: done dumping result, returning 7491 1727203979.40762: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-0a4a-ad01-000000000d48] 7491 1727203979.40778: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000d48 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7491 1727203979.40935: no more pending results, returning what we have 7491 1727203979.40939: results queue empty 7491 1727203979.40940: checking for any_errors_fatal 7491 1727203979.40947: done checking for any_errors_fatal 7491 1727203979.40948: checking for max_fail_percentage 7491 1727203979.40950: done checking for max_fail_percentage 7491 1727203979.40951: checking to see if all hosts have failed and the running result is not ok 7491 1727203979.40952: done checking to see if all hosts have failed 7491 1727203979.40953: getting the remaining hosts for this loop 7491 1727203979.40955: done getting the remaining hosts for this loop 7491 
1727203979.40960: getting the next task for host managed-node3 7491 1727203979.40970: done getting next task for host managed-node3 7491 1727203979.40974: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7491 1727203979.40978: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203979.40997: getting variables 7491 1727203979.40999: in VariableManager get_vars() 7491 1727203979.41056: Calling all_inventory to load vars for managed-node3 7491 1727203979.41059: Calling groups_inventory to load vars for managed-node3 7491 1727203979.41062: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203979.41075: Calling all_plugins_play to load vars for managed-node3 7491 1727203979.41079: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203979.41082: Calling groups_plugins_play to load vars for managed-node3 7491 1727203979.42104: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000d48 7491 1727203979.42108: WORKER PROCESS EXITING 7491 1727203979.42793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203979.44539: done with get_vars() 7491 1727203979.44566: done getting variables 7491 1727203979.44630: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:52:59 -0400 (0:00:00.063) 0:00:21.370 ***** 7491 1727203979.44669: entering _queue_task() for managed-node3/set_fact 7491 1727203979.44969: worker is 1 (out of 1 available) 7491 1727203979.44982: exiting _queue_task() for managed-node3/set_fact 7491 1727203979.44994: done queuing things up, now waiting for results queue to drain 7491 1727203979.44995: waiting for pending results... 
7491 1727203979.45293: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7491 1727203979.45469: in run() - task 0affcd87-79f5-0a4a-ad01-000000000d49 7491 1727203979.45497: variable 'ansible_search_path' from source: unknown 7491 1727203979.45506: variable 'ansible_search_path' from source: unknown 7491 1727203979.45552: calling self._execute() 7491 1727203979.45651: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.45665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.45679: variable 'omit' from source: magic vars 7491 1727203979.46091: variable 'ansible_distribution_major_version' from source: facts 7491 1727203979.46113: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203979.46305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203979.46602: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203979.46656: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203979.46702: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203979.46741: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203979.46839: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203979.46878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203979.46914: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203979.46946: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203979.47043: variable '__network_is_ostree' from source: set_fact 7491 1727203979.47057: Evaluated conditional (not __network_is_ostree is defined): False 7491 1727203979.47067: when evaluation is False, skipping this task 7491 1727203979.47077: _execute() done 7491 1727203979.47087: dumping result to json 7491 1727203979.47095: done dumping result, returning 7491 1727203979.47107: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-0a4a-ad01-000000000d49] 7491 1727203979.47119: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000d49 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7491 1727203979.47277: no more pending results, returning what we have 7491 1727203979.47282: results queue empty 7491 1727203979.47283: checking for any_errors_fatal 7491 1727203979.47293: done checking for any_errors_fatal 7491 1727203979.47293: checking for max_fail_percentage 7491 1727203979.47296: done checking for max_fail_percentage 7491 1727203979.47297: checking to see if all hosts have failed and the running result is not ok 7491 1727203979.47298: done checking to see if all hosts have failed 7491 1727203979.47299: getting the remaining hosts for this loop 7491 1727203979.47302: done getting the remaining hosts for this loop 7491 1727203979.47306: getting the next task for host managed-node3 7491 1727203979.47317: done getting next task for host managed-node3 7491 1727203979.47321: ^ 
task is: TASK: fedora.linux_system_roles.network : Check which services are running 7491 1727203979.47326: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203979.47345: getting variables 7491 1727203979.47347: in VariableManager get_vars() 7491 1727203979.47407: Calling all_inventory to load vars for managed-node3 7491 1727203979.47411: Calling groups_inventory to load vars for managed-node3 7491 1727203979.47414: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203979.47425: Calling all_plugins_play to load vars for managed-node3 7491 1727203979.47428: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203979.47431: Calling groups_plugins_play to load vars for managed-node3 7491 1727203979.48407: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000d49 7491 1727203979.48410: WORKER PROCESS EXITING 7491 1727203979.49331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203979.51005: done with get_vars() 7491 1727203979.51035: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task 
path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:52:59 -0400 (0:00:00.064) 0:00:21.435 ***** 7491 1727203979.51141: entering _queue_task() for managed-node3/service_facts 7491 1727203979.51448: worker is 1 (out of 1 available) 7491 1727203979.51465: exiting _queue_task() for managed-node3/service_facts 7491 1727203979.51480: done queuing things up, now waiting for results queue to drain 7491 1727203979.51481: waiting for pending results... 7491 1727203979.51780: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 7491 1727203979.51945: in run() - task 0affcd87-79f5-0a4a-ad01-000000000d4b 7491 1727203979.51970: variable 'ansible_search_path' from source: unknown 7491 1727203979.51979: variable 'ansible_search_path' from source: unknown 7491 1727203979.52023: calling self._execute() 7491 1727203979.52135: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.52149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.52165: variable 'omit' from source: magic vars 7491 1727203979.52548: variable 'ansible_distribution_major_version' from source: facts 7491 1727203979.52572: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203979.52586: variable 'omit' from source: magic vars 7491 1727203979.52660: variable 'omit' from source: magic vars 7491 1727203979.52705: variable 'omit' from source: magic vars 7491 1727203979.52750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203979.52801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203979.52828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203979.52850: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203979.52869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203979.52912: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203979.52921: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.52929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.53042: Set connection var ansible_timeout to 10 7491 1727203979.53054: Set connection var ansible_pipelining to False 7491 1727203979.53066: Set connection var ansible_shell_type to sh 7491 1727203979.53078: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203979.53092: Set connection var ansible_shell_executable to /bin/sh 7491 1727203979.53109: Set connection var ansible_connection to ssh 7491 1727203979.53136: variable 'ansible_shell_executable' from source: unknown 7491 1727203979.53143: variable 'ansible_connection' from source: unknown 7491 1727203979.53151: variable 'ansible_module_compression' from source: unknown 7491 1727203979.53158: variable 'ansible_shell_type' from source: unknown 7491 1727203979.53167: variable 'ansible_shell_executable' from source: unknown 7491 1727203979.53175: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203979.53185: variable 'ansible_pipelining' from source: unknown 7491 1727203979.53191: variable 'ansible_timeout' from source: unknown 7491 1727203979.53200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203979.53412: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
7491 1727203979.53438: variable 'omit' from source: magic vars 7491 1727203979.53449: starting attempt loop 7491 1727203979.53456: running the handler 7491 1727203979.53478: _low_level_execute_command(): starting 7491 1727203979.53491: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203979.54279: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203979.54299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203979.54319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203979.54339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203979.54387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203979.54402: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203979.54423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203979.54443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203979.54456: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203979.54472: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203979.54486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203979.54501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203979.54524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203979.54542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203979.54555: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203979.54574: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203979.54659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203979.54685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203979.54703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203979.54792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203979.56409: stdout chunk (state=3): >>>/root <<< 7491 1727203979.56510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203979.56586: stderr chunk (state=3): >>><<< 7491 1727203979.56590: stdout chunk (state=3): >>><<< 7491 1727203979.56614: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203979.56630: 
_low_level_execute_command(): starting 7491 1727203979.56636: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203979.5661402-8675-174878470077781 `" && echo ansible-tmp-1727203979.5661402-8675-174878470077781="` echo /root/.ansible/tmp/ansible-tmp-1727203979.5661402-8675-174878470077781 `" ) && sleep 0' 7491 1727203979.57300: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203979.57310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203979.57323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203979.57338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203979.57376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203979.57384: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203979.57395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203979.57407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203979.57415: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203979.57425: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203979.57433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203979.57442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203979.57453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203979.57460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203979.57469: 
stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203979.57478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203979.57552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203979.57568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203979.57578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203979.57652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203979.59456: stdout chunk (state=3): >>>ansible-tmp-1727203979.5661402-8675-174878470077781=/root/.ansible/tmp/ansible-tmp-1727203979.5661402-8675-174878470077781 <<< 7491 1727203979.59569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203979.59626: stderr chunk (state=3): >>><<< 7491 1727203979.59629: stdout chunk (state=3): >>><<< 7491 1727203979.59644: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203979.5661402-8675-174878470077781=/root/.ansible/tmp/ansible-tmp-1727203979.5661402-8675-174878470077781 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203979.59687: variable 'ansible_module_compression' from source: unknown 7491 1727203979.59729: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7491 1727203979.59763: variable 'ansible_facts' from source: unknown 7491 1727203979.59817: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203979.5661402-8675-174878470077781/AnsiballZ_service_facts.py 7491 1727203979.59930: Sending initial data 7491 1727203979.59939: Sent initial data (160 bytes) 7491 1727203979.60622: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203979.60630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203979.60682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203979.60686: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7491 1727203979.60700: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203979.60703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203979.60738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203979.60784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203979.60842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203979.62502: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203979.62535: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203979.62576: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmplcbhly97 /root/.ansible/tmp/ansible-tmp-1727203979.5661402-8675-174878470077781/AnsiballZ_service_facts.py <<< 7491 1727203979.62612: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203979.63430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203979.63567: stderr chunk (state=3): >>><<< 7491 1727203979.63570: stdout chunk (state=3): >>><<< 7491 1727203979.63599: done transferring module to remote 7491 1727203979.63602: _low_level_execute_command(): starting 7491 1727203979.63605: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727203979.5661402-8675-174878470077781/ /root/.ansible/tmp/ansible-tmp-1727203979.5661402-8675-174878470077781/AnsiballZ_service_facts.py && sleep 0' 7491 1727203979.64212: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203979.64221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203979.64231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203979.64245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203979.64284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203979.64291: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203979.64303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203979.64314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203979.64321: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203979.64329: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203979.64336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203979.64345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203979.64356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203979.64366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203979.64376: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203979.64385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203979.64456: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 7491 1727203979.64472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203979.64481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203979.64550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203979.66230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203979.66322: stderr chunk (state=3): >>><<< 7491 1727203979.66325: stdout chunk (state=3): >>><<< 7491 1727203979.66340: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203979.66343: _low_level_execute_command(): starting 7491 1727203979.66348: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727203979.5661402-8675-174878470077781/AnsiballZ_service_facts.py && sleep 0' 7491 1727203979.66980: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203979.66988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203979.66999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203979.67014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203979.67053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203979.67060: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203979.67072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203979.67085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203979.67093: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203979.67099: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203979.67107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203979.67116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203979.67128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203979.67135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203979.67141: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203979.67149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203979.67223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203979.67242: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203979.67252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203979.67339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203980.91832: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 7491 1727203980.91877: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": 
{"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "stat<<< 7491 1727203980.91894: stdout chunk (state=3): >>>e": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 7491 1727203980.91897: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap<<< 7491 1727203980.91901: stdout chunk (state=3): >>>.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": 
"rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "sy<<< 7491 1727203980.91904: stdout chunk (state=3): >>>stemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7491 1727203980.93186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203980.93275: stderr chunk (state=3): >>><<< 7491 1727203980.93279: stdout chunk (state=3): >>><<< 7491 1727203980.93285: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": 
{"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": 
"systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203980.93927: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203979.5661402-8675-174878470077781/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203980.93943: _low_level_execute_command(): starting 7491 1727203980.93952: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203979.5661402-8675-174878470077781/ > /dev/null 2>&1 && sleep 0' 7491 1727203980.94611: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203980.94626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203980.94641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203980.94658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203980.94705: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203980.94716: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203980.94733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203980.94750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203980.94761: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203980.94779: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203980.94790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203980.94803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203980.94817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203980.94829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203980.94841: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203980.94854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203980.94932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203980.94948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203980.94966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203980.95040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203980.96860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203980.96866: stdout chunk (state=3): >>><<< 7491 1727203980.96875: stderr chunk (state=3): >>><<< 7491 1727203980.96891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203980.96897: handler run complete 7491 1727203980.97085: variable 'ansible_facts' from source: unknown 7491 1727203980.97247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203980.97659: variable 'ansible_facts' from source: unknown 7491 1727203980.97782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203980.97960: attempt loop complete, returning result 7491 1727203980.97965: _execute() done 7491 1727203980.97968: dumping result to json 7491 1727203980.98025: done dumping result, returning 7491 1727203980.98034: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-0a4a-ad01-000000000d4b] 7491 1727203980.98041: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000d4b 7491 
1727203980.98842: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000d4b 7491 1727203980.98845: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203980.98923: no more pending results, returning what we have 7491 1727203980.98926: results queue empty 7491 1727203980.98927: checking for any_errors_fatal 7491 1727203980.98932: done checking for any_errors_fatal 7491 1727203980.98933: checking for max_fail_percentage 7491 1727203980.98934: done checking for max_fail_percentage 7491 1727203980.98935: checking to see if all hosts have failed and the running result is not ok 7491 1727203980.98936: done checking to see if all hosts have failed 7491 1727203980.98937: getting the remaining hosts for this loop 7491 1727203980.98939: done getting the remaining hosts for this loop 7491 1727203980.98942: getting the next task for host managed-node3 7491 1727203980.98947: done getting next task for host managed-node3 7491 1727203980.98951: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7491 1727203980.98957: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 7491 1727203980.98968: getting variables 7491 1727203980.98969: in VariableManager get_vars() 7491 1727203980.99010: Calling all_inventory to load vars for managed-node3 7491 1727203980.99012: Calling groups_inventory to load vars for managed-node3 7491 1727203980.99015: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203980.99023: Calling all_plugins_play to load vars for managed-node3 7491 1727203980.99026: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203980.99034: Calling groups_plugins_play to load vars for managed-node3 7491 1727203981.00633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203981.02977: done with get_vars() 7491 1727203981.03009: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:53:01 -0400 (0:00:01.519) 0:00:22.954 ***** 7491 1727203981.03110: entering _queue_task() for managed-node3/package_facts 7491 1727203981.03425: worker is 1 (out of 1 available) 7491 1727203981.03438: exiting _queue_task() for managed-node3/package_facts 7491 1727203981.03451: done queuing things up, now waiting for results queue to drain 7491 1727203981.03452: waiting for pending results... 
7491 1727203981.03754: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7491 1727203981.03895: in run() - task 0affcd87-79f5-0a4a-ad01-000000000d4c 7491 1727203981.03916: variable 'ansible_search_path' from source: unknown 7491 1727203981.03920: variable 'ansible_search_path' from source: unknown 7491 1727203981.03955: calling self._execute() 7491 1727203981.04054: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203981.04058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203981.04073: variable 'omit' from source: magic vars 7491 1727203981.04441: variable 'ansible_distribution_major_version' from source: facts 7491 1727203981.04459: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203981.04467: variable 'omit' from source: magic vars 7491 1727203981.04551: variable 'omit' from source: magic vars 7491 1727203981.04591: variable 'omit' from source: magic vars 7491 1727203981.04702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203981.04706: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203981.04709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203981.04772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203981.04775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203981.04777: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203981.04779: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203981.04782: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727203981.04985: Set connection var ansible_timeout to 10 7491 1727203981.04989: Set connection var ansible_pipelining to False 7491 1727203981.04992: Set connection var ansible_shell_type to sh 7491 1727203981.04995: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203981.04997: Set connection var ansible_shell_executable to /bin/sh 7491 1727203981.05000: Set connection var ansible_connection to ssh 7491 1727203981.05002: variable 'ansible_shell_executable' from source: unknown 7491 1727203981.05005: variable 'ansible_connection' from source: unknown 7491 1727203981.05008: variable 'ansible_module_compression' from source: unknown 7491 1727203981.05010: variable 'ansible_shell_type' from source: unknown 7491 1727203981.05013: variable 'ansible_shell_executable' from source: unknown 7491 1727203981.05015: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203981.05018: variable 'ansible_pipelining' from source: unknown 7491 1727203981.05020: variable 'ansible_timeout' from source: unknown 7491 1727203981.05022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203981.05169: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203981.05178: variable 'omit' from source: magic vars 7491 1727203981.05183: starting attempt loop 7491 1727203981.05186: running the handler 7491 1727203981.05201: _low_level_execute_command(): starting 7491 1727203981.05209: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203981.06113: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203981.06128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203981.06256: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203981.06273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.06312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.06318: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203981.06332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.06344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203981.06354: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203981.06363: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203981.06373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203981.06387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203981.06400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.06408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.06415: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203981.06429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.06537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203981.06552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203981.06562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203981.06718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203981.08151: stdout chunk (state=3): >>>/root <<< 
7491 1727203981.08337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203981.08343: stdout chunk (state=3): >>><<< 7491 1727203981.08352: stderr chunk (state=3): >>><<< 7491 1727203981.08379: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203981.08392: _low_level_execute_command(): starting 7491 1727203981.08399: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203981.0837831-8708-109869808194676 `" && echo ansible-tmp-1727203981.0837831-8708-109869808194676="` echo /root/.ansible/tmp/ansible-tmp-1727203981.0837831-8708-109869808194676 `" ) && sleep 0' 7491 1727203981.09255: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 7491 1727203981.09262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.09308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.09327: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203981.09362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.09389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203981.09403: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203981.09415: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203981.09430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203981.09447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203981.09463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.09482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.09495: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203981.11205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.11298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203981.11367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203981.11387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203981.11469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203981.13274: stdout chunk (state=3): 
>>>ansible-tmp-1727203981.0837831-8708-109869808194676=/root/.ansible/tmp/ansible-tmp-1727203981.0837831-8708-109869808194676 <<< 7491 1727203981.13471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203981.13475: stdout chunk (state=3): >>><<< 7491 1727203981.13477: stderr chunk (state=3): >>><<< 7491 1727203981.13776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203981.0837831-8708-109869808194676=/root/.ansible/tmp/ansible-tmp-1727203981.0837831-8708-109869808194676 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203981.13780: variable 'ansible_module_compression' from source: unknown 7491 1727203981.13782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7491 1727203981.13785: variable 'ansible_facts' from source: unknown 7491 
1727203981.13857: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203981.0837831-8708-109869808194676/AnsiballZ_package_facts.py 7491 1727203981.14470: Sending initial data 7491 1727203981.14478: Sent initial data (160 bytes) 7491 1727203981.16349: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203981.16369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203981.16392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203981.16425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.16473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.16498: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203981.16516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.16535: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203981.16548: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203981.16559: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203981.16580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203981.16595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203981.16619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.16633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.16651: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203981.16668: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.16751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203981.16778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203981.16796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203981.16872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203981.18548: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203981.18588: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203981.18633: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp_4gukiro /root/.ansible/tmp/ansible-tmp-1727203981.0837831-8708-109869808194676/AnsiballZ_package_facts.py <<< 7491 1727203981.18669: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203981.21918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203981.22001: stderr chunk (state=3): >>><<< 7491 1727203981.22005: stdout chunk (state=3): >>><<< 7491 1727203981.22026: done transferring module to remote 7491 1727203981.22037: _low_level_execute_command(): starting 7491 1727203981.22042: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727203981.0837831-8708-109869808194676/ /root/.ansible/tmp/ansible-tmp-1727203981.0837831-8708-109869808194676/AnsiballZ_package_facts.py && sleep 0' 7491 1727203981.23533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203981.23665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203981.23676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203981.23691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.23800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.23807: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203981.23820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.23831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203981.23843: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203981.23845: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203981.23852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203981.23862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203981.23884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.23890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.23899: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203981.23907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.23983: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 7491 1727203981.24085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203981.24094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203981.24224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203981.25960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203981.25965: stdout chunk (state=3): >>><<< 7491 1727203981.25973: stderr chunk (state=3): >>><<< 7491 1727203981.25991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203981.25994: _low_level_execute_command(): starting 7491 1727203981.26000: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727203981.0837831-8708-109869808194676/AnsiballZ_package_facts.py && sleep 0' 7491 1727203981.26921: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203981.26925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203981.26933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203981.26947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.26991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.26997: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203981.27007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.27021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203981.27027: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203981.27034: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203981.27041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203981.27050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203981.27065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.27075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.27083: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203981.27096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.27168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203981.27183: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203981.27186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203981.27279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203981.72915: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", 
"release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gli<<< 7491 1727203981.72943: stdout chunk (state=3): >>>bc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", 
"version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch":
"x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", 
"version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", 
"version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7",
"release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": 
"cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name":
"iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", 
"version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": 
"tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": 
"0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch":
"x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", 
"release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", 
"release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 7491 1727203981.73097: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", 
"release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 7491 1727203981.73108: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": 
[{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": 
"nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 7491 
1727203981.73118: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": 
"noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 7491 1727203981.73122: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": 
[{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 7491 1727203981.73127: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": 
[{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 7491 1727203981.73129: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": 
"python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 7491 1727203981.73153: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7491 1727203981.74685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203981.74762: stderr chunk (state=3): >>><<< 7491 1727203981.74767: stdout chunk (state=3): >>><<< 7491 1727203981.74815: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": 
"10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", 
"version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", 
"version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", 
"release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", 
"release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": 
"4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", 
"version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": 
[{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", 
"version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": 
[{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", 
"version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
7491 1727203981.77406: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203981.0837831-8708-109869808194676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203981.77437: _low_level_execute_command(): starting 7491 1727203981.77447: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203981.0837831-8708-109869808194676/ > /dev/null 2>&1 && sleep 0' 7491 1727203981.78092: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203981.78110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203981.78127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203981.78147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.78198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.78210: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203981.78227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.78247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203981.78259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 
1727203981.78277: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203981.78290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203981.78303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203981.78319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203981.78332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203981.78344: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203981.78358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203981.78436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203981.78468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203981.78472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203981.78569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203981.80353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203981.80431: stderr chunk (state=3): >>><<< 7491 1727203981.80435: stdout chunk (state=3): >>><<< 7491 1727203981.80450: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203981.80457: handler run complete 7491 1727203981.81380: variable 'ansible_facts' from source: unknown 7491 1727203981.81873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203981.84037: variable 'ansible_facts' from source: unknown 7491 1727203981.84491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203981.85257: attempt loop complete, returning result 7491 1727203981.85271: _execute() done 7491 1727203981.85274: dumping result to json 7491 1727203981.85513: done dumping result, returning 7491 1727203981.85524: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-0a4a-ad01-000000000d4c] 7491 1727203981.85530: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000d4c ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203981.88075: no more pending results, returning what we have 7491 1727203981.88079: results queue empty 7491 1727203981.88080: checking for any_errors_fatal 7491 1727203981.88085: done checking for any_errors_fatal 7491 1727203981.88086: checking for max_fail_percentage 7491 1727203981.88088: done checking for max_fail_percentage 
7491 1727203981.88088: checking to see if all hosts have failed and the running result is not ok 7491 1727203981.88089: done checking to see if all hosts have failed 7491 1727203981.88090: getting the remaining hosts for this loop 7491 1727203981.88091: done getting the remaining hosts for this loop 7491 1727203981.88095: getting the next task for host managed-node3 7491 1727203981.88100: done getting next task for host managed-node3 7491 1727203981.88104: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7491 1727203981.88106: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203981.88116: getting variables 7491 1727203981.88117: in VariableManager get_vars() 7491 1727203981.88157: Calling all_inventory to load vars for managed-node3 7491 1727203981.88160: Calling groups_inventory to load vars for managed-node3 7491 1727203981.88162: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203981.88187: Calling all_plugins_play to load vars for managed-node3 7491 1727203981.88190: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203981.88194: Calling groups_plugins_play to load vars for managed-node3 7491 1727203981.88722: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000d4c 7491 1727203981.88726: WORKER PROCESS EXITING 7491 1727203981.88974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203981.90138: done with get_vars() 7491 1727203981.90178: done getting variables 7491 1727203981.90251: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:53:01 -0400 (0:00:00.871) 0:00:23.826 ***** 7491 1727203981.90290: entering _queue_task() for managed-node3/debug 7491 1727203981.90647: worker is 1 (out of 1 available) 7491 1727203981.90665: exiting _queue_task() for managed-node3/debug 7491 1727203981.90682: done queuing things up, now waiting for results queue to drain 7491 1727203981.90684: waiting for pending results... 
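
The task just queued is `Print network provider` at `roles/network/tasks/main.yml:7`, and its result below prints "Using network provider: nm". A minimal sketch of what that debug task plausibly looks like, assuming the message is templated from the `network_provider` fact the log shows being read from `set_fact`:

```yaml
# Hedged reconstruction of tasks/main.yml:7; the exact source may differ.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```

With `network_provider` set to `nm`, this yields the `MSG: Using network provider: nm` seen in the task result.
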
7491 1727203981.91084: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 7491 1727203981.91187: in run() - task 0affcd87-79f5-0a4a-ad01-00000000006a 7491 1727203981.91199: variable 'ansible_search_path' from source: unknown 7491 1727203981.91202: variable 'ansible_search_path' from source: unknown 7491 1727203981.91238: calling self._execute() 7491 1727203981.91320: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203981.91329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203981.91337: variable 'omit' from source: magic vars 7491 1727203981.91619: variable 'ansible_distribution_major_version' from source: facts 7491 1727203981.91632: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203981.91639: variable 'omit' from source: magic vars 7491 1727203981.91685: variable 'omit' from source: magic vars 7491 1727203981.91759: variable 'network_provider' from source: set_fact 7491 1727203981.91775: variable 'omit' from source: magic vars 7491 1727203981.91811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203981.91841: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203981.91859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203981.91876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203981.91885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203981.91909: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203981.91912: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203981.91915: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203981.91988: Set connection var ansible_timeout to 10 7491 1727203981.91994: Set connection var ansible_pipelining to False 7491 1727203981.91998: Set connection var ansible_shell_type to sh 7491 1727203981.92004: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203981.92011: Set connection var ansible_shell_executable to /bin/sh 7491 1727203981.92015: Set connection var ansible_connection to ssh 7491 1727203981.92037: variable 'ansible_shell_executable' from source: unknown 7491 1727203981.92040: variable 'ansible_connection' from source: unknown 7491 1727203981.92043: variable 'ansible_module_compression' from source: unknown 7491 1727203981.92045: variable 'ansible_shell_type' from source: unknown 7491 1727203981.92047: variable 'ansible_shell_executable' from source: unknown 7491 1727203981.92049: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203981.92053: variable 'ansible_pipelining' from source: unknown 7491 1727203981.92055: variable 'ansible_timeout' from source: unknown 7491 1727203981.92059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203981.92166: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203981.92175: variable 'omit' from source: magic vars 7491 1727203981.92180: starting attempt loop 7491 1727203981.92183: running the handler 7491 1727203981.92221: handler run complete 7491 1727203981.92233: attempt loop complete, returning result 7491 1727203981.92236: _execute() done 7491 1727203981.92239: dumping result to json 7491 1727203981.92241: done dumping result, returning 7491 1727203981.92248: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-0a4a-ad01-00000000006a] 7491 1727203981.92253: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006a ok: [managed-node3] => {} MSG: Using network provider: nm 7491 1727203981.92403: no more pending results, returning what we have 7491 1727203981.92407: results queue empty 7491 1727203981.92409: checking for any_errors_fatal 7491 1727203981.92420: done checking for any_errors_fatal 7491 1727203981.92421: checking for max_fail_percentage 7491 1727203981.92423: done checking for max_fail_percentage 7491 1727203981.92424: checking to see if all hosts have failed and the running result is not ok 7491 1727203981.92425: done checking to see if all hosts have failed 7491 1727203981.92426: getting the remaining hosts for this loop 7491 1727203981.92428: done getting the remaining hosts for this loop 7491 1727203981.92432: getting the next task for host managed-node3 7491 1727203981.92438: done getting next task for host managed-node3 7491 1727203981.92442: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7491 1727203981.92446: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203981.92457: getting variables 7491 1727203981.92459: in VariableManager get_vars() 7491 1727203981.92505: Calling all_inventory to load vars for managed-node3 7491 1727203981.92508: Calling groups_inventory to load vars for managed-node3 7491 1727203981.92510: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203981.92520: Calling all_plugins_play to load vars for managed-node3 7491 1727203981.92522: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203981.92525: Calling groups_plugins_play to load vars for managed-node3 7491 1727203981.93086: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006a 7491 1727203981.93089: WORKER PROCESS EXITING 7491 1727203981.93751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203981.95440: done with get_vars() 7491 1727203981.95458: done getting variables 7491 1727203981.95507: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:53:01 -0400 (0:00:00.052) 0:00:23.879 ***** 7491 1727203981.95537: entering _queue_task() for managed-node3/fail 7491 1727203981.95778: worker is 1 (out of 1 available) 7491 1727203981.95791: exiting _queue_task() for managed-node3/fail 7491 1727203981.95805: done queuing things up, now waiting for results queue to drain 7491 1727203981.95806: waiting for pending results... 
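
The queued task is a `fail`-action guard at `tasks/main.yml:11`, and its result below records `false_condition: "network_state != {}"`. A sketch under those observations; only the condition string is taken from the log, and the failure message wording is assumed (the real task may also check the provider):

```yaml
# Hedged sketch of the guard at tasks/main.yml:11. Condition from the
# log's "Evaluated conditional (network_state != {})"; msg is assumed.
- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider  # assumed wording
  when: network_state != {}
```

Here `network_state` defaults to `{}`, so the condition evaluates False and the task is skipped, as the log shows.
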
7491 1727203981.95996: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7491 1727203981.96089: in run() - task 0affcd87-79f5-0a4a-ad01-00000000006b 7491 1727203981.96100: variable 'ansible_search_path' from source: unknown 7491 1727203981.96103: variable 'ansible_search_path' from source: unknown 7491 1727203981.96134: calling self._execute() 7491 1727203981.96215: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203981.96222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203981.96229: variable 'omit' from source: magic vars 7491 1727203981.96507: variable 'ansible_distribution_major_version' from source: facts 7491 1727203981.96519: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203981.96604: variable 'network_state' from source: role '' defaults 7491 1727203981.96611: Evaluated conditional (network_state != {}): False 7491 1727203981.96614: when evaluation is False, skipping this task 7491 1727203981.96619: _execute() done 7491 1727203981.96622: dumping result to json 7491 1727203981.96624: done dumping result, returning 7491 1727203981.96629: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-0a4a-ad01-00000000006b] 7491 1727203981.96636: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006b 7491 1727203981.96726: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006b 7491 1727203981.96729: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203981.96781: no more pending results, returning what we have 7491 
1727203981.96785: results queue empty 7491 1727203981.96786: checking for any_errors_fatal 7491 1727203981.96793: done checking for any_errors_fatal 7491 1727203981.96795: checking for max_fail_percentage 7491 1727203981.96797: done checking for max_fail_percentage 7491 1727203981.96798: checking to see if all hosts have failed and the running result is not ok 7491 1727203981.96799: done checking to see if all hosts have failed 7491 1727203981.96800: getting the remaining hosts for this loop 7491 1727203981.96802: done getting the remaining hosts for this loop 7491 1727203981.96806: getting the next task for host managed-node3 7491 1727203981.96811: done getting next task for host managed-node3 7491 1727203981.96818: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7491 1727203981.96821: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203981.96853: getting variables 7491 1727203981.96856: in VariableManager get_vars() 7491 1727203981.96923: Calling all_inventory to load vars for managed-node3 7491 1727203981.96927: Calling groups_inventory to load vars for managed-node3 7491 1727203981.96934: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203981.96957: Calling all_plugins_play to load vars for managed-node3 7491 1727203981.96962: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203981.96973: Calling groups_plugins_play to load vars for managed-node3 7491 1727203981.98527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203982.00169: done with get_vars() 7491 1727203982.00203: done getting variables 7491 1727203982.00270: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.047) 0:00:23.926 ***** 7491 1727203982.00308: entering _queue_task() for managed-node3/fail 7491 1727203982.00646: worker is 1 (out of 1 available) 7491 1727203982.00661: exiting _queue_task() for managed-node3/fail 7491 1727203982.00677: done queuing things up, now waiting for results queue to drain 7491 1727203982.00678: waiting for pending results... 
7491 1727203982.00971: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7491 1727203982.01076: in run() - task 0affcd87-79f5-0a4a-ad01-00000000006c 7491 1727203982.01087: variable 'ansible_search_path' from source: unknown 7491 1727203982.01090: variable 'ansible_search_path' from source: unknown 7491 1727203982.01122: calling self._execute() 7491 1727203982.01199: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.01204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.01213: variable 'omit' from source: magic vars 7491 1727203982.01497: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.01506: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203982.01596: variable 'network_state' from source: role '' defaults 7491 1727203982.01605: Evaluated conditional (network_state != {}): False 7491 1727203982.01607: when evaluation is False, skipping this task 7491 1727203982.01610: _execute() done 7491 1727203982.01614: dumping result to json 7491 1727203982.01616: done dumping result, returning 7491 1727203982.01625: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-0a4a-ad01-00000000006c] 7491 1727203982.01631: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006c 7491 1727203982.01722: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006c 7491 1727203982.01725: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203982.01771: no more pending results, returning what we have 7491 1727203982.01775: results queue 
empty 7491 1727203982.01776: checking for any_errors_fatal 7491 1727203982.01782: done checking for any_errors_fatal 7491 1727203982.01782: checking for max_fail_percentage 7491 1727203982.01784: done checking for max_fail_percentage 7491 1727203982.01785: checking to see if all hosts have failed and the running result is not ok 7491 1727203982.01787: done checking to see if all hosts have failed 7491 1727203982.01787: getting the remaining hosts for this loop 7491 1727203982.01789: done getting the remaining hosts for this loop 7491 1727203982.01793: getting the next task for host managed-node3 7491 1727203982.01800: done getting next task for host managed-node3 7491 1727203982.01804: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7491 1727203982.01807: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203982.01828: getting variables 7491 1727203982.01829: in VariableManager get_vars() 7491 1727203982.01877: Calling all_inventory to load vars for managed-node3 7491 1727203982.01880: Calling groups_inventory to load vars for managed-node3 7491 1727203982.01882: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203982.01891: Calling all_plugins_play to load vars for managed-node3 7491 1727203982.01893: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203982.01896: Calling groups_plugins_play to load vars for managed-node3 7491 1727203982.02981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203982.08430: done with get_vars() 7491 1727203982.08453: done getting variables 7491 1727203982.08495: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.082) 0:00:24.009 ***** 7491 1727203982.08519: entering _queue_task() for managed-node3/fail 7491 1727203982.08756: worker is 1 (out of 1 available) 7491 1727203982.08772: exiting _queue_task() for managed-node3/fail 7491 1727203982.08785: done queuing things up, now waiting for results queue to drain 7491 1727203982.08787: waiting for pending results... 
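
The next task, at `tasks/main.yml:25`, is another `fail` guard whose skip result below records `false_condition: "ansible_distribution_major_version | int > 9"`. A hedged sketch built from that logged condition; the failure message is assumed:

```yaml
# Hedged sketch of tasks/main.yml:25; condition copied from the log,
# msg text assumed.
- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later  # assumed wording
  when: ansible_distribution_major_version | int > 9
```

On this EL9 managed host the cast `"9" | int > 9` is False, so the task is skipped.
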
7491 1727203982.08982: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7491 1727203982.09075: in run() - task 0affcd87-79f5-0a4a-ad01-00000000006d 7491 1727203982.09088: variable 'ansible_search_path' from source: unknown 7491 1727203982.09092: variable 'ansible_search_path' from source: unknown 7491 1727203982.09126: calling self._execute() 7491 1727203982.09209: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.09213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.09227: variable 'omit' from source: magic vars 7491 1727203982.09525: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.09536: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203982.09676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203982.11354: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203982.11413: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203982.11443: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203982.11468: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203982.11490: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203982.11552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.11574: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.11592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.11717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.11721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.11734: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.11750: Evaluated conditional (ansible_distribution_major_version | int > 9): False 7491 1727203982.11754: when evaluation is False, skipping this task 7491 1727203982.11756: _execute() done 7491 1727203982.11759: dumping result to json 7491 1727203982.11761: done dumping result, returning 7491 1727203982.11773: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-0a4a-ad01-00000000006d] 7491 1727203982.11776: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006d 7491 1727203982.11875: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006d 7491 1727203982.11878: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 7491 1727203982.11927: no more pending results, returning what we have 7491 1727203982.11930: results queue 
empty 7491 1727203982.11931: checking for any_errors_fatal 7491 1727203982.11943: done checking for any_errors_fatal 7491 1727203982.11943: checking for max_fail_percentage 7491 1727203982.11945: done checking for max_fail_percentage 7491 1727203982.11946: checking to see if all hosts have failed and the running result is not ok 7491 1727203982.11948: done checking to see if all hosts have failed 7491 1727203982.11948: getting the remaining hosts for this loop 7491 1727203982.11951: done getting the remaining hosts for this loop 7491 1727203982.11954: getting the next task for host managed-node3 7491 1727203982.11961: done getting next task for host managed-node3 7491 1727203982.11967: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7491 1727203982.11970: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203982.11996: getting variables 7491 1727203982.11998: in VariableManager get_vars() 7491 1727203982.12046: Calling all_inventory to load vars for managed-node3 7491 1727203982.12049: Calling groups_inventory to load vars for managed-node3 7491 1727203982.12051: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203982.12060: Calling all_plugins_play to load vars for managed-node3 7491 1727203982.12063: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203982.12069: Calling groups_plugins_play to load vars for managed-node3 7491 1727203982.13671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203982.15252: done with get_vars() 7491 1727203982.15279: done getting variables 7491 1727203982.15329: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.068) 0:00:24.077 ***** 7491 1727203982.15354: entering _queue_task() for managed-node3/dnf 7491 1727203982.15595: worker is 1 (out of 1 available) 7491 1727203982.15607: exiting _queue_task() for managed-node3/dnf 7491 1727203982.15623: done queuing things up, now waiting for results queue to drain 7491 1727203982.15625: waiting for pending results... 
7491 1727203982.15814: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7491 1727203982.15915: in run() - task 0affcd87-79f5-0a4a-ad01-00000000006e 7491 1727203982.15927: variable 'ansible_search_path' from source: unknown 7491 1727203982.15932: variable 'ansible_search_path' from source: unknown 7491 1727203982.15965: calling self._execute() 7491 1727203982.16044: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.16048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.16058: variable 'omit' from source: magic vars 7491 1727203982.16345: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.16355: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203982.16504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203982.18143: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203982.18461: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203982.18490: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203982.18520: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203982.18539: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203982.18601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.18622: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.18640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.18670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.18680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.18761: variable 'ansible_distribution' from source: facts 7491 1727203982.18767: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.18783: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7491 1727203982.18869: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203982.18956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.18974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.18993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.19022: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.19032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.19060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.19078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.19095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.19125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.19135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.19162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.19180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.19196: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.19226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.19236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.19340: variable 'network_connections' from source: task vars 7491 1727203982.19352: variable 'interface' from source: play vars 7491 1727203982.19400: variable 'interface' from source: play vars 7491 1727203982.19453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203982.19568: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203982.19598: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203982.19622: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203982.19642: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203982.19691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203982.19707: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203982.19730: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.19748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203982.19788: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203982.19938: variable 'network_connections' from source: task vars 7491 1727203982.19942: variable 'interface' from source: play vars 7491 1727203982.19992: variable 'interface' from source: play vars 7491 1727203982.20011: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727203982.20014: when evaluation is False, skipping this task 7491 1727203982.20019: _execute() done 7491 1727203982.20022: dumping result to json 7491 1727203982.20024: done dumping result, returning 7491 1727203982.20030: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-00000000006e] 7491 1727203982.20035: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006e 7491 1727203982.20139: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006e 7491 1727203982.20142: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7491 1727203982.20194: no more pending results, returning what we have 7491 1727203982.20198: results queue empty 7491 1727203982.20199: checking for any_errors_fatal 7491 1727203982.20208: done checking for any_errors_fatal 7491 1727203982.20209: checking for 
max_fail_percentage 7491 1727203982.20211: done checking for max_fail_percentage 7491 1727203982.20212: checking to see if all hosts have failed and the running result is not ok 7491 1727203982.20213: done checking to see if all hosts have failed 7491 1727203982.20214: getting the remaining hosts for this loop 7491 1727203982.20219: done getting the remaining hosts for this loop 7491 1727203982.20223: getting the next task for host managed-node3 7491 1727203982.20229: done getting next task for host managed-node3 7491 1727203982.20234: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7491 1727203982.20236: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203982.20253: getting variables 7491 1727203982.20255: in VariableManager get_vars() 7491 1727203982.20305: Calling all_inventory to load vars for managed-node3 7491 1727203982.20308: Calling groups_inventory to load vars for managed-node3 7491 1727203982.20314: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203982.20326: Calling all_plugins_play to load vars for managed-node3 7491 1727203982.20328: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203982.20331: Calling groups_plugins_play to load vars for managed-node3 7491 1727203982.21296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203982.22227: done with get_vars() 7491 1727203982.22248: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7491 1727203982.22307: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.069) 0:00:24.147 ***** 7491 1727203982.22333: entering _queue_task() for managed-node3/yum 7491 1727203982.22576: worker is 1 (out of 1 available) 7491 1727203982.22591: exiting _queue_task() for managed-node3/yum 7491 1727203982.22605: done queuing things up, now waiting for results queue to drain 7491 1727203982.22606: waiting for pending results... 
7491 1727203982.22800: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7491 1727203982.22898: in run() - task 0affcd87-79f5-0a4a-ad01-00000000006f 7491 1727203982.22909: variable 'ansible_search_path' from source: unknown 7491 1727203982.22913: variable 'ansible_search_path' from source: unknown 7491 1727203982.22947: calling self._execute() 7491 1727203982.23022: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.23028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.23036: variable 'omit' from source: magic vars 7491 1727203982.23326: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.23337: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203982.23474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203982.25124: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203982.25179: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203982.25210: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203982.25238: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203982.25259: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203982.25319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.25341: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.25359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.25387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.25398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.25471: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.25485: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7491 1727203982.25488: when evaluation is False, skipping this task 7491 1727203982.25491: _execute() done 7491 1727203982.25493: dumping result to json 7491 1727203982.25496: done dumping result, returning 7491 1727203982.25504: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-00000000006f] 7491 1727203982.25509: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006f 7491 1727203982.25604: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000006f 7491 1727203982.25607: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7491 1727203982.25665: no more pending results, returning what we have 7491 
1727203982.25669: results queue empty 7491 1727203982.25670: checking for any_errors_fatal 7491 1727203982.25681: done checking for any_errors_fatal 7491 1727203982.25681: checking for max_fail_percentage 7491 1727203982.25683: done checking for max_fail_percentage 7491 1727203982.25684: checking to see if all hosts have failed and the running result is not ok 7491 1727203982.25685: done checking to see if all hosts have failed 7491 1727203982.25686: getting the remaining hosts for this loop 7491 1727203982.25688: done getting the remaining hosts for this loop 7491 1727203982.25692: getting the next task for host managed-node3 7491 1727203982.25698: done getting next task for host managed-node3 7491 1727203982.25702: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7491 1727203982.25705: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203982.25729: getting variables 7491 1727203982.25731: in VariableManager get_vars() 7491 1727203982.25783: Calling all_inventory to load vars for managed-node3 7491 1727203982.25786: Calling groups_inventory to load vars for managed-node3 7491 1727203982.25788: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203982.25797: Calling all_plugins_play to load vars for managed-node3 7491 1727203982.25800: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203982.25802: Calling groups_plugins_play to load vars for managed-node3 7491 1727203982.26625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203982.27575: done with get_vars() 7491 1727203982.27597: done getting variables 7491 1727203982.27643: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.053) 0:00:24.200 ***** 7491 1727203982.27672: entering _queue_task() for managed-node3/fail 7491 1727203982.27911: worker is 1 (out of 1 available) 7491 1727203982.27928: exiting _queue_task() for managed-node3/fail 7491 1727203982.27941: done queuing things up, now waiting for results queue to drain 7491 1727203982.27943: waiting for pending results... 
7491 1727203982.28145: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7491 1727203982.28244: in run() - task 0affcd87-79f5-0a4a-ad01-000000000070 7491 1727203982.28255: variable 'ansible_search_path' from source: unknown 7491 1727203982.28259: variable 'ansible_search_path' from source: unknown 7491 1727203982.28292: calling self._execute() 7491 1727203982.28376: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.28380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.28390: variable 'omit' from source: magic vars 7491 1727203982.28689: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.28699: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203982.28797: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203982.28939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203982.30840: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203982.30889: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203982.30930: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203982.30957: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203982.30978: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203982.31037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7491 1727203982.31060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.31080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.31108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.31123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.31156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.31177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.31193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.31224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.31235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7491 1727203982.31264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.31283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.31299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.31327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.31337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.31456: variable 'network_connections' from source: task vars 7491 1727203982.31468: variable 'interface' from source: play vars 7491 1727203982.31527: variable 'interface' from source: play vars 7491 1727203982.31578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203982.31694: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203982.31724: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203982.31748: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203982.31779: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7491 1727203982.31812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203982.31832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203982.31849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.31871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203982.31908: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203982.32071: variable 'network_connections' from source: task vars 7491 1727203982.32076: variable 'interface' from source: play vars 7491 1727203982.32124: variable 'interface' from source: play vars 7491 1727203982.32145: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727203982.32149: when evaluation is False, skipping this task 7491 1727203982.32152: _execute() done 7491 1727203982.32154: dumping result to json 7491 1727203982.32156: done dumping result, returning 7491 1727203982.32162: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-000000000070] 7491 1727203982.32169: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000070 7491 1727203982.32275: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000070 7491 1727203982.32277: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7491 1727203982.32340: no more pending results, returning what we have 7491 1727203982.32344: results queue empty 7491 1727203982.32345: checking for any_errors_fatal 7491 1727203982.32355: done checking for any_errors_fatal 7491 1727203982.32356: checking for max_fail_percentage 7491 1727203982.32361: done checking for max_fail_percentage 7491 1727203982.32362: checking to see if all hosts have failed and the running result is not ok 7491 1727203982.32363: done checking to see if all hosts have failed 7491 1727203982.32364: getting the remaining hosts for this loop 7491 1727203982.32366: done getting the remaining hosts for this loop 7491 1727203982.32372: getting the next task for host managed-node3 7491 1727203982.32379: done getting next task for host managed-node3 7491 1727203982.32382: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7491 1727203982.32385: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203982.32403: getting variables 7491 1727203982.32405: in VariableManager get_vars() 7491 1727203982.32448: Calling all_inventory to load vars for managed-node3 7491 1727203982.32451: Calling groups_inventory to load vars for managed-node3 7491 1727203982.32453: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203982.32470: Calling all_plugins_play to load vars for managed-node3 7491 1727203982.32473: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203982.32476: Calling groups_plugins_play to load vars for managed-node3 7491 1727203982.33385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203982.34315: done with get_vars() 7491 1727203982.34334: done getting variables 7491 1727203982.34380: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.067) 0:00:24.267 ***** 7491 1727203982.34409: entering _queue_task() for managed-node3/package 7491 1727203982.34644: worker is 1 (out of 1 available) 7491 1727203982.34658: exiting _queue_task() for managed-node3/package 7491 1727203982.34673: done queuing things up, now waiting for results queue to drain 7491 1727203982.34675: waiting for pending results... 
7491 1727203982.34870: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 7491 1727203982.34965: in run() - task 0affcd87-79f5-0a4a-ad01-000000000071 7491 1727203982.34977: variable 'ansible_search_path' from source: unknown 7491 1727203982.34983: variable 'ansible_search_path' from source: unknown 7491 1727203982.35013: calling self._execute() 7491 1727203982.35093: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.35097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.35107: variable 'omit' from source: magic vars 7491 1727203982.35393: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.35403: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203982.35548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203982.35745: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203982.35781: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203982.35805: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203982.35867: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203982.35955: variable 'network_packages' from source: role '' defaults 7491 1727203982.36036: variable '__network_provider_setup' from source: role '' defaults 7491 1727203982.36044: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203982.36094: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203982.36102: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727203982.36148: variable '__network_packages_default_nm' from source: role 
'' defaults 7491 1727203982.36271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203982.37753: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203982.37799: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203982.37831: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203982.37855: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203982.37876: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203982.37948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.37970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.37988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.38017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.38031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.38066: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.38082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.38099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.38129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.38143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.38292: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7491 1727203982.38375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.38393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.38410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.38441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.38449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.38517: variable 'ansible_python' from source: facts 7491 1727203982.38539: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7491 1727203982.38601: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727203982.38661: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727203982.38750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.38766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.38789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.38815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.38829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.38861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.38889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.38908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.38937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.38947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.39051: variable 'network_connections' from source: task vars 7491 1727203982.39057: variable 'interface' from source: play vars 7491 1727203982.39138: variable 'interface' from source: play vars 7491 1727203982.39196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203982.39214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203982.39240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.39262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203982.39299: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203982.39491: variable 'network_connections' from source: task vars 7491 1727203982.39495: variable 'interface' from source: play vars 7491 1727203982.39577: variable 'interface' from source: play vars 7491 1727203982.39602: variable '__network_packages_default_wireless' from source: role '' defaults 7491 1727203982.39661: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203982.39871: variable 'network_connections' from source: task vars 7491 1727203982.39874: variable 'interface' from source: play vars 7491 1727203982.39919: variable 'interface' from source: play vars 7491 1727203982.39938: variable '__network_packages_default_team' from source: role '' defaults 7491 1727203982.39997: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203982.40203: variable 'network_connections' from source: task vars 7491 1727203982.40207: variable 'interface' from source: play vars 7491 1727203982.40255: variable 'interface' from source: play vars 7491 1727203982.40297: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203982.40343: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203982.40348: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727203982.40396: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727203982.40539: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7491 1727203982.40851: variable 'network_connections' from source: task vars 7491 1727203982.40856: variable 'interface' from source: play vars 7491 1727203982.40901: variable 'interface' from source: play vars 7491 
1727203982.40907: variable 'ansible_distribution' from source: facts 7491 1727203982.40910: variable '__network_rh_distros' from source: role '' defaults 7491 1727203982.40916: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.40930: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7491 1727203982.41046: variable 'ansible_distribution' from source: facts 7491 1727203982.41050: variable '__network_rh_distros' from source: role '' defaults 7491 1727203982.41053: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.41062: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7491 1727203982.41173: variable 'ansible_distribution' from source: facts 7491 1727203982.41178: variable '__network_rh_distros' from source: role '' defaults 7491 1727203982.41180: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.41209: variable 'network_provider' from source: set_fact 7491 1727203982.41223: variable 'ansible_facts' from source: unknown 7491 1727203982.41817: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7491 1727203982.41820: when evaluation is False, skipping this task 7491 1727203982.41823: _execute() done 7491 1727203982.41826: dumping result to json 7491 1727203982.41831: done dumping result, returning 7491 1727203982.41838: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-0a4a-ad01-000000000071] 7491 1727203982.41843: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000071 7491 1727203982.41943: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000071 7491 1727203982.41946: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result 
was False" } 7491 1727203982.41991: no more pending results, returning what we have 7491 1727203982.41996: results queue empty 7491 1727203982.41997: checking for any_errors_fatal 7491 1727203982.42005: done checking for any_errors_fatal 7491 1727203982.42006: checking for max_fail_percentage 7491 1727203982.42007: done checking for max_fail_percentage 7491 1727203982.42008: checking to see if all hosts have failed and the running result is not ok 7491 1727203982.42009: done checking to see if all hosts have failed 7491 1727203982.42010: getting the remaining hosts for this loop 7491 1727203982.42012: done getting the remaining hosts for this loop 7491 1727203982.42016: getting the next task for host managed-node3 7491 1727203982.42027: done getting next task for host managed-node3 7491 1727203982.42035: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7491 1727203982.42037: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203982.42056: getting variables 7491 1727203982.42057: in VariableManager get_vars() 7491 1727203982.42106: Calling all_inventory to load vars for managed-node3 7491 1727203982.42109: Calling groups_inventory to load vars for managed-node3 7491 1727203982.42111: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203982.42121: Calling all_plugins_play to load vars for managed-node3 7491 1727203982.42123: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203982.42131: Calling groups_plugins_play to load vars for managed-node3 7491 1727203982.42967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203982.44037: done with get_vars() 7491 1727203982.44054: done getting variables 7491 1727203982.44103: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.097) 0:00:24.365 ***** 7491 1727203982.44128: entering _queue_task() for managed-node3/package 7491 1727203982.44361: worker is 1 (out of 1 available) 7491 1727203982.44377: exiting _queue_task() for managed-node3/package 7491 1727203982.44391: done queuing things up, now waiting for results queue to drain 7491 1727203982.44392: waiting for pending results... 
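(Editor's note, not part of the log.) The "Install packages" skip above hinges on the Jinja2 `subset` test: the task only runs when some name in `network_packages` is missing from the installed-package facts. A minimal Python sketch of that check, with stand-in data (the dict and list below are hypothetical, not taken from this run):

```python
# Stand-in for ansible_facts.packages: package name -> list of version dicts.
installed = {
    "NetworkManager": [{"version": "1.40"}],
    "openssh": [{"version": "8.7"}],
}
# Stand-in for the role's network_packages list.
network_packages = ["NetworkManager"]

# Equivalent of the log's condition:
#   not network_packages is subset(ansible_facts.packages.keys())
needs_install = not set(network_packages).issubset(installed.keys())
print(needs_install)  # False -> the task is skipped, matching the log
```

Because every requested package is already a key of the facts dict, the condition is False and the task is skipped, exactly as the log's `false_condition` records.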
7491 1727203982.44579: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7491 1727203982.44672: in run() - task 0affcd87-79f5-0a4a-ad01-000000000072 7491 1727203982.44684: variable 'ansible_search_path' from source: unknown 7491 1727203982.44688: variable 'ansible_search_path' from source: unknown 7491 1727203982.44718: calling self._execute() 7491 1727203982.44802: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.44806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.44815: variable 'omit' from source: magic vars 7491 1727203982.45114: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.45126: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203982.45215: variable 'network_state' from source: role '' defaults 7491 1727203982.45225: Evaluated conditional (network_state != {}): False 7491 1727203982.45229: when evaluation is False, skipping this task 7491 1727203982.45232: _execute() done 7491 1727203982.45234: dumping result to json 7491 1727203982.45236: done dumping result, returning 7491 1727203982.45244: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-0a4a-ad01-000000000072] 7491 1727203982.45251: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000072 7491 1727203982.45343: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000072 7491 1727203982.45345: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203982.45418: no more pending results, returning what we have 7491 1727203982.45421: results queue empty 7491 1727203982.45422: checking for any_errors_fatal 
7491 1727203982.45429: done checking for any_errors_fatal 7491 1727203982.45430: checking for max_fail_percentage 7491 1727203982.45432: done checking for max_fail_percentage 7491 1727203982.45433: checking to see if all hosts have failed and the running result is not ok 7491 1727203982.45434: done checking to see if all hosts have failed 7491 1727203982.45434: getting the remaining hosts for this loop 7491 1727203982.45437: done getting the remaining hosts for this loop 7491 1727203982.45440: getting the next task for host managed-node3 7491 1727203982.45445: done getting next task for host managed-node3 7491 1727203982.45449: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7491 1727203982.45452: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203982.45476: getting variables 7491 1727203982.45478: in VariableManager get_vars() 7491 1727203982.45522: Calling all_inventory to load vars for managed-node3 7491 1727203982.45525: Calling groups_inventory to load vars for managed-node3 7491 1727203982.45527: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203982.45535: Calling all_plugins_play to load vars for managed-node3 7491 1727203982.45538: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203982.45540: Calling groups_plugins_play to load vars for managed-node3 7491 1727203982.46321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203982.47250: done with get_vars() 7491 1727203982.47268: done getting variables 7491 1727203982.47313: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.032) 0:00:24.397 ***** 7491 1727203982.47338: entering _queue_task() for managed-node3/package 7491 1727203982.47569: worker is 1 (out of 1 available) 7491 1727203982.47583: exiting _queue_task() for managed-node3/package 7491 1727203982.47597: done queuing things up, now waiting for results queue to drain 7491 1727203982.47599: waiting for pending results... 
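(Editor's note, not part of the log.) Both nmstate-related install tasks in this stretch are gated on `network_state != {}`, and `network_state` comes "from source: role '' defaults", i.e. it was never set by the play. A one-liner sketch of why both evaluate False (the default value here is an assumption based on the log, not copied from the role source):

```python
# Role default when the play never sets network_state.
network_state = {}

# Equivalent of the log's condition: network_state != {}
run_nmstate_install = network_state != {}
print(run_nmstate_install)  # False -> both nmstate tasks are skipped
```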
7491 1727203982.47793: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7491 1727203982.47888: in run() - task 0affcd87-79f5-0a4a-ad01-000000000073 7491 1727203982.47898: variable 'ansible_search_path' from source: unknown 7491 1727203982.47902: variable 'ansible_search_path' from source: unknown 7491 1727203982.47932: calling self._execute() 7491 1727203982.48019: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.48023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.48029: variable 'omit' from source: magic vars 7491 1727203982.48319: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.48328: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203982.48413: variable 'network_state' from source: role '' defaults 7491 1727203982.48422: Evaluated conditional (network_state != {}): False 7491 1727203982.48425: when evaluation is False, skipping this task 7491 1727203982.48428: _execute() done 7491 1727203982.48430: dumping result to json 7491 1727203982.48433: done dumping result, returning 7491 1727203982.48439: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-0a4a-ad01-000000000073] 7491 1727203982.48445: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000073 7491 1727203982.48539: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000073 7491 1727203982.48542: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203982.48591: no more pending results, returning what we have 7491 1727203982.48595: results queue empty 7491 1727203982.48596: checking for any_errors_fatal 7491 
1727203982.48604: done checking for any_errors_fatal 7491 1727203982.48605: checking for max_fail_percentage 7491 1727203982.48607: done checking for max_fail_percentage 7491 1727203982.48608: checking to see if all hosts have failed and the running result is not ok 7491 1727203982.48609: done checking to see if all hosts have failed 7491 1727203982.48610: getting the remaining hosts for this loop 7491 1727203982.48612: done getting the remaining hosts for this loop 7491 1727203982.48615: getting the next task for host managed-node3 7491 1727203982.48624: done getting next task for host managed-node3 7491 1727203982.48628: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7491 1727203982.48631: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203982.48646: getting variables 7491 1727203982.48648: in VariableManager get_vars() 7491 1727203982.48697: Calling all_inventory to load vars for managed-node3 7491 1727203982.48700: Calling groups_inventory to load vars for managed-node3 7491 1727203982.48702: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203982.48711: Calling all_plugins_play to load vars for managed-node3 7491 1727203982.48713: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203982.48718: Calling groups_plugins_play to load vars for managed-node3 7491 1727203982.49721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203982.50649: done with get_vars() 7491 1727203982.50670: done getting variables 7491 1727203982.50714: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.034) 0:00:24.431 ***** 7491 1727203982.50744: entering _queue_task() for managed-node3/service 7491 1727203982.51105: worker is 1 (out of 1 available) 7491 1727203982.51117: exiting _queue_task() for managed-node3/service 7491 1727203982.51131: done queuing things up, now waiting for results queue to drain 7491 1727203982.51132: waiting for pending results... 
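(Editor's note, not part of the log.) The `__network_wireless_connections_defined or __network_team_connections_defined` gate seen in these skips can be sketched as a check over the play's `network_connections` list. The connection entry below is hypothetical; this run's `interface` play var is not shown in the excerpt:

```python
# Stand-in for the play's network_connections variable.
network_connections = [{"name": "eth-test", "type": "ethernet"}]

# Rough equivalents of the two role-default flags from the log.
wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

# Equivalent of the log's condition:
#   __network_wireless_connections_defined or __network_team_connections_defined
restart_needed = wireless_defined or team_defined
print(restart_needed)  # False -> "Restart NetworkManager" is skipped
```

With only an ethernet connection defined, neither flag is set, so the restart task is skipped just as the surrounding skip results show.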
7491 1727203982.51484: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7491 1727203982.51612: in run() - task 0affcd87-79f5-0a4a-ad01-000000000074 7491 1727203982.51633: variable 'ansible_search_path' from source: unknown 7491 1727203982.51637: variable 'ansible_search_path' from source: unknown 7491 1727203982.51675: calling self._execute() 7491 1727203982.51790: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.51797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.51809: variable 'omit' from source: magic vars 7491 1727203982.52367: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.52379: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203982.52479: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203982.52616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203982.54288: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203982.54576: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203982.54580: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203982.54583: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203982.54585: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203982.54588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 
1727203982.54591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.54593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.54629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.54670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.55073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.55077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.55080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.55082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.55084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7491 1727203982.55086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.55088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.55090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.55092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.55093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.55103: variable 'network_connections' from source: task vars 7491 1727203982.55118: variable 'interface' from source: play vars 7491 1727203982.55188: variable 'interface' from source: play vars 7491 1727203982.55265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203982.55417: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203982.55466: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203982.55495: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203982.55526: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203982.55568: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203982.55593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203982.55612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.55641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203982.55686: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203982.55928: variable 'network_connections' from source: task vars 7491 1727203982.55932: variable 'interface' from source: play vars 7491 1727203982.55997: variable 'interface' from source: play vars 7491 1727203982.56025: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727203982.56029: when evaluation is False, skipping this task 7491 1727203982.56032: _execute() done 7491 1727203982.56034: dumping result to json 7491 1727203982.56036: done dumping result, returning 7491 1727203982.56045: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-000000000074] 7491 1727203982.56051: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000074 7491 1727203982.56146: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000074 7491 1727203982.56154: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7491 1727203982.56202: no more pending results, returning what we have 7491 1727203982.56205: results queue empty 7491 1727203982.56206: checking for any_errors_fatal 7491 1727203982.56213: done checking for any_errors_fatal 7491 1727203982.56213: checking for max_fail_percentage 7491 1727203982.56215: done checking for max_fail_percentage 7491 1727203982.56216: checking to see if all hosts have failed and the running result is not ok 7491 1727203982.56217: done checking to see if all hosts have failed 7491 1727203982.56218: getting the remaining hosts for this loop 7491 1727203982.56220: done getting the remaining hosts for this loop 7491 1727203982.56223: getting the next task for host managed-node3 7491 1727203982.56229: done getting next task for host managed-node3 7491 1727203982.56233: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7491 1727203982.56236: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203982.56252: getting variables 7491 1727203982.56254: in VariableManager get_vars() 7491 1727203982.56303: Calling all_inventory to load vars for managed-node3 7491 1727203982.56305: Calling groups_inventory to load vars for managed-node3 7491 1727203982.56307: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203982.56316: Calling all_plugins_play to load vars for managed-node3 7491 1727203982.56319: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203982.56322: Calling groups_plugins_play to load vars for managed-node3 7491 1727203982.57738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203982.59490: done with get_vars() 7491 1727203982.59523: done getting variables 7491 1727203982.59580: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:53:02 -0400 (0:00:00.088) 0:00:24.520 ***** 7491 1727203982.59623: entering _queue_task() for managed-node3/service 7491 1727203982.59992: worker is 1 (out of 1 available) 7491 1727203982.60005: exiting _queue_task() for managed-node3/service 7491 1727203982.60020: done queuing things up, now waiting for results queue to drain 7491 1727203982.60021: waiting for pending results... 
7491 1727203982.60342: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7491 1727203982.60503: in run() - task 0affcd87-79f5-0a4a-ad01-000000000075 7491 1727203982.60529: variable 'ansible_search_path' from source: unknown 7491 1727203982.60538: variable 'ansible_search_path' from source: unknown 7491 1727203982.60589: calling self._execute() 7491 1727203982.60709: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.60722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.60737: variable 'omit' from source: magic vars 7491 1727203982.61143: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.61162: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203982.61351: variable 'network_provider' from source: set_fact 7491 1727203982.61362: variable 'network_state' from source: role '' defaults 7491 1727203982.61380: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7491 1727203982.61391: variable 'omit' from source: magic vars 7491 1727203982.61473: variable 'omit' from source: magic vars 7491 1727203982.61509: variable 'network_service_name' from source: role '' defaults 7491 1727203982.61588: variable 'network_service_name' from source: role '' defaults 7491 1727203982.61712: variable '__network_provider_setup' from source: role '' defaults 7491 1727203982.61724: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203982.61800: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203982.61813: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727203982.61887: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727203982.62135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 
1727203982.64867: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203982.64948: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203982.65007: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203982.65055: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203982.65090: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203982.65184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.65222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.65265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.65313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.65334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.65394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7491 1727203982.65422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.65454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.65503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.65522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.65778: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7491 1727203982.65899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.65926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.65952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.66001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.66021: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.66125: variable 'ansible_python' from source: facts 7491 1727203982.66152: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7491 1727203982.66251: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727203982.66345: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727203982.66488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.66519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.66557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.66603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.66621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.66682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203982.66718: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203982.66752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.66804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203982.66822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203982.66984: variable 'network_connections' from source: task vars 7491 1727203982.66997: variable 'interface' from source: play vars 7491 1727203982.67081: variable 'interface' from source: play vars 7491 1727203982.67202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203982.67420: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203982.67478: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203982.67536: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203982.67583: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203982.67659: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203982.67699: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203982.67745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203982.67785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203982.67841: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203982.68154: variable 'network_connections' from source: task vars 7491 1727203982.68172: variable 'interface' from source: play vars 7491 1727203982.68251: variable 'interface' from source: play vars 7491 1727203982.68301: variable '__network_packages_default_wireless' from source: role '' defaults 7491 1727203982.68395: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203982.68721: variable 'network_connections' from source: task vars 7491 1727203982.68731: variable 'interface' from source: play vars 7491 1727203982.68806: variable 'interface' from source: play vars 7491 1727203982.68843: variable '__network_packages_default_team' from source: role '' defaults 7491 1727203982.68931: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203982.69225: variable 'network_connections' from source: task vars 7491 1727203982.69235: variable 'interface' from source: play vars 7491 1727203982.69321: variable 'interface' from source: play vars 7491 1727203982.69395: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203982.69459: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203982.69483: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7491 1727203982.69549: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727203982.69793: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7491 1727203982.70363: variable 'network_connections' from source: task vars 7491 1727203982.70375: variable 'interface' from source: play vars 7491 1727203982.70439: variable 'interface' from source: play vars 7491 1727203982.70459: variable 'ansible_distribution' from source: facts 7491 1727203982.70473: variable '__network_rh_distros' from source: role '' defaults 7491 1727203982.70485: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.70503: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7491 1727203982.70695: variable 'ansible_distribution' from source: facts 7491 1727203982.70705: variable '__network_rh_distros' from source: role '' defaults 7491 1727203982.70715: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.70732: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7491 1727203982.70928: variable 'ansible_distribution' from source: facts 7491 1727203982.70938: variable '__network_rh_distros' from source: role '' defaults 7491 1727203982.70948: variable 'ansible_distribution_major_version' from source: facts 7491 1727203982.70994: variable 'network_provider' from source: set_fact 7491 1727203982.71029: variable 'omit' from source: magic vars 7491 1727203982.71062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203982.71096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203982.71131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203982.71154: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203982.71174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203982.71213: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203982.71231: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.71240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.71360: Set connection var ansible_timeout to 10 7491 1727203982.71375: Set connection var ansible_pipelining to False 7491 1727203982.71386: Set connection var ansible_shell_type to sh 7491 1727203982.71397: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203982.71410: Set connection var ansible_shell_executable to /bin/sh 7491 1727203982.71418: Set connection var ansible_connection to ssh 7491 1727203982.71457: variable 'ansible_shell_executable' from source: unknown 7491 1727203982.71468: variable 'ansible_connection' from source: unknown 7491 1727203982.71476: variable 'ansible_module_compression' from source: unknown 7491 1727203982.71483: variable 'ansible_shell_type' from source: unknown 7491 1727203982.71490: variable 'ansible_shell_executable' from source: unknown 7491 1727203982.71497: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203982.71505: variable 'ansible_pipelining' from source: unknown 7491 1727203982.71512: variable 'ansible_timeout' from source: unknown 7491 1727203982.71518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203982.71636: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203982.71657: variable 'omit' from source: magic vars 7491 1727203982.71679: starting attempt loop 7491 1727203982.71688: running the handler 7491 1727203982.71784: variable 'ansible_facts' from source: unknown 7491 1727203982.72575: _low_level_execute_command(): starting 7491 1727203982.72587: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203982.73348: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203982.73367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203982.73384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203982.73411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203982.73454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203982.73468: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203982.73485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203982.73509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203982.73525: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203982.73537: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203982.73550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203982.73567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203982.73584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 
1727203982.73598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203982.73611: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203982.73631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203982.73708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203982.73738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203982.73758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203982.73843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203982.75474: stdout chunk (state=3): >>>/root <<< 7491 1727203982.75566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203982.75666: stderr chunk (state=3): >>><<< 7491 1727203982.75680: stdout chunk (state=3): >>><<< 7491 1727203982.75807: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203982.75812: _low_level_execute_command(): starting 7491 1727203982.75815: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203982.7571082-8768-33132050287387 `" && echo ansible-tmp-1727203982.7571082-8768-33132050287387="` echo /root/.ansible/tmp/ansible-tmp-1727203982.7571082-8768-33132050287387 `" ) && sleep 0' 7491 1727203982.76432: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203982.76446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203982.76468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203982.76486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203982.76530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203982.76541: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203982.76554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203982.76580: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203982.76591: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203982.76601: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203982.76612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203982.76625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203982.76639: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203982.76649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203982.76659: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203982.76677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203982.76755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203982.76773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203982.76793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203982.76877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203982.78692: stdout chunk (state=3): >>>ansible-tmp-1727203982.7571082-8768-33132050287387=/root/.ansible/tmp/ansible-tmp-1727203982.7571082-8768-33132050287387 <<< 7491 1727203982.78895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203982.78899: stdout chunk (state=3): >>><<< 7491 1727203982.78906: stderr chunk (state=3): >>><<< 7491 1727203982.78930: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203982.7571082-8768-33132050287387=/root/.ansible/tmp/ansible-tmp-1727203982.7571082-8768-33132050287387 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203982.78967: variable 'ansible_module_compression' from source: unknown 7491 1727203982.79027: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7491 1727203982.79088: variable 'ansible_facts' from source: unknown 7491 1727203982.79285: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203982.7571082-8768-33132050287387/AnsiballZ_systemd.py 7491 1727203982.79452: Sending initial data 7491 1727203982.79455: Sent initial data (153 bytes) 7491 1727203982.80490: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203982.80497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203982.80507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203982.80524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203982.80561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203982.80569: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203982.80580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203982.80593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 
1727203982.80601: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203982.80607: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203982.80614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203982.80626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203982.80639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203982.80645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203982.80651: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203982.80660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203982.80734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203982.80754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203982.80768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203982.80839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203982.82540: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203982.82579: stderr chunk (state=3): >>>debug1: Using server download size 261120 
debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203982.82620: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp3amz__2v /root/.ansible/tmp/ansible-tmp-1727203982.7571082-8768-33132050287387/AnsiballZ_systemd.py <<< 7491 1727203982.82663: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203982.85102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203982.85192: stderr chunk (state=3): >>><<< 7491 1727203982.85197: stdout chunk (state=3): >>><<< 7491 1727203982.85222: done transferring module to remote 7491 1727203982.85233: _low_level_execute_command(): starting 7491 1727203982.85239: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203982.7571082-8768-33132050287387/ /root/.ansible/tmp/ansible-tmp-1727203982.7571082-8768-33132050287387/AnsiballZ_systemd.py && sleep 0' 7491 1727203982.85948: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203982.85957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203982.85970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203982.85984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203982.86029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203982.86039: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203982.86049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203982.86063: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203982.86073: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 
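For reference, the sequence logged above — SFTP `put` of `AnsiballZ_systemd.py` into the per-task tmp directory, then `_low_level_execute_command()` running `/bin/sh -c 'chmod u+x <tmpdir> <module> && sleep 0'` — can be sketched locally. This is an illustrative stand-in only (local tmp dir, dummy module file), not Ansible's actual internals:

```python
import os
import subprocess
import tempfile

# Stand-ins for the transferred module (the real one arrives via SFTP,
# as in the `sftp> put ... AnsiballZ_systemd.py` line above).
tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")
module_path = os.path.join(tmpdir, "AnsiballZ_systemd.py")
with open(module_path, "w") as f:
    f.write("#!/usr/bin/python3\n")

# Same shape as the logged command: mark both the tmp dir and the module
# executable, with the trailing `sleep 0` Ansible appends.
rc = subprocess.call(
    ["/bin/sh", "-c", f"chmod u+x {tmpdir} {module_path} && sleep 0"]
)
print(rc)                                  # 0, matching rc=0 in the log
print(os.access(module_path, os.X_OK))     # True once chmod has run
```

On the real run this happens over the multiplexed SSH connection (`mux_client_request_session` in the stderr chunks), which is why only the master's exit status appears in the log.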
7491 1727203982.86080: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203982.86088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203982.86097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203982.86108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203982.86116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203982.86127: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203982.86134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203982.86214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203982.86231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203982.86245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203982.86311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203982.88185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203982.88189: stdout chunk (state=3): >>><<< 7491 1727203982.88192: stderr chunk (state=3): >>><<< 7491 1727203982.88293: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203982.88300: _low_level_execute_command(): starting 7491 1727203982.88303: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203982.7571082-8768-33132050287387/AnsiballZ_systemd.py && sleep 0' 7491 1727203982.89021: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203982.89038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203982.89054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203982.89077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203982.89120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203982.89135: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203982.89153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203982.89173: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203982.89186: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203982.89199: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203982.89211: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203982.89224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203982.89239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203982.89250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203982.89260: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203982.89276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203982.89350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203982.89369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203982.89385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203982.89480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203983.14185: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 
2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.sl<<< 7491 1727203983.14246: stdout chunk (state=3): >>>ice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "15101952", "MemoryAvailable": "infinity", "CPUUsageNSec": "135216000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", 
"LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "<<< 7491 1727203983.14255: stdout chunk (state=3): >>>SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "basic.target dbus.socket system.slice dbus-broker.service network-pre.target systemd-journald.socket 
cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:47:46 EDT", "StateChangeTimestampMonotonic": "12973041", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7491 1727203983.15667: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203983.15760: stderr chunk (state=3): >>><<< 7491 1727203983.15767: stdout chunk (state=3): >>><<< 7491 1727203983.15974: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "15101952", "MemoryAvailable": "infinity", "CPUUsageNSec": "135216000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": 
"0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "basic.target dbus.socket system.slice dbus-broker.service network-pre.target systemd-journald.socket cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:47:46 EDT", "StateChangeTimestampMonotonic": "12973041", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203983.16083: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203982.7571082-8768-33132050287387/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203983.16087: _low_level_execute_command(): starting 7491 1727203983.16090: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203982.7571082-8768-33132050287387/ > /dev/null 2>&1 && sleep 0' 7491 1727203983.16738: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203983.16742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203983.16770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203983.16773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203983.16790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203983.16793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203983.16854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203983.16872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203983.16953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203983.18692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203983.18749: stderr chunk (state=3): >>><<< 7491 1727203983.18751: stdout chunk (state=3): >>><<< 7491 1727203983.18770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203983.18775: handler run complete 7491 1727203983.18813: attempt loop complete, returning result 7491 1727203983.18816: _execute() done 7491 1727203983.18818: dumping result to json 7491 1727203983.18831: done dumping result, returning 7491 1727203983.18840: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-0a4a-ad01-000000000075] 7491 1727203983.18845: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000075 7491 1727203983.19082: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000075 7491 1727203983.19085: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203983.19137: no more pending results, returning what we have 7491 1727203983.19141: results queue empty 7491 1727203983.19142: checking for any_errors_fatal 7491 1727203983.19147: done checking for any_errors_fatal 7491 1727203983.19148: checking for max_fail_percentage 7491 1727203983.19150: done checking for max_fail_percentage 7491 1727203983.19151: checking to see if all hosts have failed and the running result is not ok 7491 1727203983.19152: done checking to see if all hosts have failed 7491 1727203983.19152: getting the remaining hosts for this loop 7491 1727203983.19154: done getting the remaining hosts for this loop 7491 1727203983.19157: getting the next task for host managed-node3 7491 1727203983.19162: done getting next task for host managed-node3 7491 1727203983.19168: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7491 
1727203983.19171: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203983.19182: getting variables 7491 1727203983.19184: in VariableManager get_vars() 7491 1727203983.19226: Calling all_inventory to load vars for managed-node3 7491 1727203983.19228: Calling groups_inventory to load vars for managed-node3 7491 1727203983.19230: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203983.19239: Calling all_plugins_play to load vars for managed-node3 7491 1727203983.19242: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203983.19244: Calling groups_plugins_play to load vars for managed-node3 7491 1727203983.20167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203983.21091: done with get_vars() 7491 1727203983.21111: done getting variables 7491 1727203983.21159: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.615) 0:00:25.135 ***** 7491 
1727203983.21186: entering _queue_task() for managed-node3/service 7491 1727203983.21425: worker is 1 (out of 1 available) 7491 1727203983.21438: exiting _queue_task() for managed-node3/service 7491 1727203983.21451: done queuing things up, now waiting for results queue to drain 7491 1727203983.21453: waiting for pending results... 7491 1727203983.21651: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7491 1727203983.21743: in run() - task 0affcd87-79f5-0a4a-ad01-000000000076 7491 1727203983.21753: variable 'ansible_search_path' from source: unknown 7491 1727203983.21756: variable 'ansible_search_path' from source: unknown 7491 1727203983.21789: calling self._execute() 7491 1727203983.21875: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.21879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203983.21888: variable 'omit' from source: magic vars 7491 1727203983.22189: variable 'ansible_distribution_major_version' from source: facts 7491 1727203983.22199: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203983.22291: variable 'network_provider' from source: set_fact 7491 1727203983.22295: Evaluated conditional (network_provider == "nm"): True 7491 1727203983.22368: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727203983.22432: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727203983.22558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203983.24113: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203983.24159: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203983.24193: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203983.24226: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203983.24245: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203983.24316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203983.24339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203983.24356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203983.24385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203983.24397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203983.24438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203983.24454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203983.24473: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203983.24498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203983.24512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203983.24543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203983.24559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203983.24581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203983.24605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203983.24618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203983.24729: variable 'network_connections' from source: task vars 7491 1727203983.24742: variable 'interface' from source: play vars 7491 1727203983.24793: variable 'interface' from source: play vars 7491 1727203983.24851: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203983.24966: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203983.24994: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203983.25017: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203983.25040: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203983.25076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203983.25092: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203983.25109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203983.25129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203983.25170: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203983.25331: variable 'network_connections' from source: task vars 7491 1727203983.25335: variable 'interface' from source: play vars 7491 1727203983.25383: variable 'interface' from source: play vars 7491 1727203983.25406: Evaluated conditional (__network_wpa_supplicant_required): False 7491 1727203983.25409: when evaluation is False, skipping this task 7491 1727203983.25412: _execute() done 7491 1727203983.25414: 
dumping result to json 7491 1727203983.25417: done dumping result, returning 7491 1727203983.25427: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-0a4a-ad01-000000000076] 7491 1727203983.25439: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000076 7491 1727203983.25527: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000076 7491 1727203983.25530: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7491 1727203983.25574: no more pending results, returning what we have 7491 1727203983.25578: results queue empty 7491 1727203983.25579: checking for any_errors_fatal 7491 1727203983.25609: done checking for any_errors_fatal 7491 1727203983.25610: checking for max_fail_percentage 7491 1727203983.25612: done checking for max_fail_percentage 7491 1727203983.25613: checking to see if all hosts have failed and the running result is not ok 7491 1727203983.25614: done checking to see if all hosts have failed 7491 1727203983.25615: getting the remaining hosts for this loop 7491 1727203983.25617: done getting the remaining hosts for this loop 7491 1727203983.25621: getting the next task for host managed-node3 7491 1727203983.25627: done getting next task for host managed-node3 7491 1727203983.25631: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7491 1727203983.25634: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203983.25652: getting variables 7491 1727203983.25654: in VariableManager get_vars() 7491 1727203983.25708: Calling all_inventory to load vars for managed-node3 7491 1727203983.25711: Calling groups_inventory to load vars for managed-node3 7491 1727203983.25713: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203983.25723: Calling all_plugins_play to load vars for managed-node3 7491 1727203983.25726: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203983.25728: Calling groups_plugins_play to load vars for managed-node3 7491 1727203983.26553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203983.27588: done with get_vars() 7491 1727203983.27608: done getting variables 7491 1727203983.27655: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.064) 0:00:25.200 ***** 7491 1727203983.27683: entering _queue_task() for managed-node3/service 7491 1727203983.27917: worker is 1 (out of 1 available) 7491 1727203983.27931: exiting _queue_task() for managed-node3/service 7491 1727203983.27945: done queuing things up, now waiting for results queue to drain 7491 1727203983.27946: waiting for pending results... 
7491 1727203983.28142: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 7491 1727203983.28241: in run() - task 0affcd87-79f5-0a4a-ad01-000000000077 7491 1727203983.28251: variable 'ansible_search_path' from source: unknown 7491 1727203983.28255: variable 'ansible_search_path' from source: unknown 7491 1727203983.28288: calling self._execute() 7491 1727203983.28369: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.28379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203983.28390: variable 'omit' from source: magic vars 7491 1727203983.28683: variable 'ansible_distribution_major_version' from source: facts 7491 1727203983.28693: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203983.28779: variable 'network_provider' from source: set_fact 7491 1727203983.28783: Evaluated conditional (network_provider == "initscripts"): False 7491 1727203983.28786: when evaluation is False, skipping this task 7491 1727203983.28789: _execute() done 7491 1727203983.28791: dumping result to json 7491 1727203983.28796: done dumping result, returning 7491 1727203983.28803: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-0a4a-ad01-000000000077] 7491 1727203983.28809: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000077 7491 1727203983.28900: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000077 7491 1727203983.28902: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203983.28968: no more pending results, returning what we have 7491 1727203983.28972: results queue empty 7491 1727203983.28973: checking for any_errors_fatal 7491 1727203983.28983: done checking for any_errors_fatal 7491 
1727203983.28983: checking for max_fail_percentage 7491 1727203983.28985: done checking for max_fail_percentage 7491 1727203983.28986: checking to see if all hosts have failed and the running result is not ok 7491 1727203983.28987: done checking to see if all hosts have failed 7491 1727203983.28988: getting the remaining hosts for this loop 7491 1727203983.28990: done getting the remaining hosts for this loop 7491 1727203983.28994: getting the next task for host managed-node3 7491 1727203983.28999: done getting next task for host managed-node3 7491 1727203983.29003: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7491 1727203983.29006: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203983.29029: getting variables 7491 1727203983.29031: in VariableManager get_vars() 7491 1727203983.29076: Calling all_inventory to load vars for managed-node3 7491 1727203983.29079: Calling groups_inventory to load vars for managed-node3 7491 1727203983.29081: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203983.29090: Calling all_plugins_play to load vars for managed-node3 7491 1727203983.29092: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203983.29095: Calling groups_plugins_play to load vars for managed-node3 7491 1727203983.29898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203983.30820: done with get_vars() 7491 1727203983.30839: done getting variables 7491 1727203983.30888: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.032) 0:00:25.233 ***** 7491 1727203983.30914: entering _queue_task() for managed-node3/copy 7491 1727203983.31149: worker is 1 (out of 1 available) 7491 1727203983.31166: exiting _queue_task() for managed-node3/copy 7491 1727203983.31180: done queuing things up, now waiting for results queue to drain 7491 1727203983.31181: waiting for pending results... 
7491 1727203983.31371: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7491 1727203983.31471: in run() - task 0affcd87-79f5-0a4a-ad01-000000000078 7491 1727203983.31482: variable 'ansible_search_path' from source: unknown 7491 1727203983.31486: variable 'ansible_search_path' from source: unknown 7491 1727203983.31515: calling self._execute() 7491 1727203983.31599: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.31603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203983.31611: variable 'omit' from source: magic vars 7491 1727203983.31904: variable 'ansible_distribution_major_version' from source: facts 7491 1727203983.31914: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203983.32002: variable 'network_provider' from source: set_fact 7491 1727203983.32005: Evaluated conditional (network_provider == "initscripts"): False 7491 1727203983.32008: when evaluation is False, skipping this task 7491 1727203983.32011: _execute() done 7491 1727203983.32015: dumping result to json 7491 1727203983.32021: done dumping result, returning 7491 1727203983.32028: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-0a4a-ad01-000000000078] 7491 1727203983.32035: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000078 7491 1727203983.32123: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000078 7491 1727203983.32126: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7491 1727203983.32177: no more pending results, returning what we have 7491 1727203983.32181: results queue empty 7491 1727203983.32182: checking for any_errors_fatal 7491 
1727203983.32189: done checking for any_errors_fatal 7491 1727203983.32189: checking for max_fail_percentage 7491 1727203983.32191: done checking for max_fail_percentage 7491 1727203983.32192: checking to see if all hosts have failed and the running result is not ok 7491 1727203983.32193: done checking to see if all hosts have failed 7491 1727203983.32194: getting the remaining hosts for this loop 7491 1727203983.32195: done getting the remaining hosts for this loop 7491 1727203983.32199: getting the next task for host managed-node3 7491 1727203983.32206: done getting next task for host managed-node3 7491 1727203983.32209: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7491 1727203983.32213: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203983.32232: getting variables 7491 1727203983.32241: in VariableManager get_vars() 7491 1727203983.32288: Calling all_inventory to load vars for managed-node3 7491 1727203983.32291: Calling groups_inventory to load vars for managed-node3 7491 1727203983.32293: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203983.32302: Calling all_plugins_play to load vars for managed-node3 7491 1727203983.32305: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203983.32307: Calling groups_plugins_play to load vars for managed-node3 7491 1727203983.33278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203983.34210: done with get_vars() 7491 1727203983.34237: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.033) 0:00:25.266 ***** 7491 1727203983.34308: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 7491 1727203983.34553: worker is 1 (out of 1 available) 7491 1727203983.34569: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 7491 1727203983.34583: done queuing things up, now waiting for results queue to drain 7491 1727203983.34585: waiting for pending results... 
7491 1727203983.34777: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7491 1727203983.34872: in run() - task 0affcd87-79f5-0a4a-ad01-000000000079 7491 1727203983.34883: variable 'ansible_search_path' from source: unknown 7491 1727203983.34887: variable 'ansible_search_path' from source: unknown 7491 1727203983.34919: calling self._execute() 7491 1727203983.34999: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.35003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203983.35012: variable 'omit' from source: magic vars 7491 1727203983.35305: variable 'ansible_distribution_major_version' from source: facts 7491 1727203983.35315: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203983.35322: variable 'omit' from source: magic vars 7491 1727203983.35361: variable 'omit' from source: magic vars 7491 1727203983.35485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203983.37068: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203983.37121: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203983.37149: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203983.37179: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203983.37202: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203983.37266: variable 'network_provider' from source: set_fact 7491 1727203983.37363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203983.37397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203983.37414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203983.37446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203983.37458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203983.37512: variable 'omit' from source: magic vars 7491 1727203983.37597: variable 'omit' from source: magic vars 7491 1727203983.37673: variable 'network_connections' from source: task vars 7491 1727203983.37686: variable 'interface' from source: play vars 7491 1727203983.37732: variable 'interface' from source: play vars 7491 1727203983.37835: variable 'omit' from source: magic vars 7491 1727203983.37843: variable '__lsr_ansible_managed' from source: task vars 7491 1727203983.37891: variable '__lsr_ansible_managed' from source: task vars 7491 1727203983.38079: Loaded config def from plugin (lookup/template) 7491 1727203983.38082: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7491 1727203983.38105: File lookup term: get_ansible_managed.j2 7491 1727203983.38113: variable 'ansible_search_path' from source: unknown 7491 1727203983.38119: evaluation_path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7491 1727203983.38131: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7491 1727203983.38144: variable 'ansible_search_path' from source: unknown 7491 1727203983.41702: variable 'ansible_managed' from source: unknown 7491 1727203983.41788: variable 'omit' from source: magic vars 7491 1727203983.41811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203983.41834: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203983.41849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203983.41862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203983.41872: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203983.41894: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203983.41899: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.41902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203983.41967: Set connection var ansible_timeout to 10 7491 1727203983.41972: Set connection var ansible_pipelining to False 7491 1727203983.41978: Set connection var ansible_shell_type to sh 7491 1727203983.41983: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203983.41989: Set connection var ansible_shell_executable to /bin/sh 7491 1727203983.41998: Set connection var ansible_connection to ssh 7491 1727203983.42016: variable 'ansible_shell_executable' from source: unknown 7491 1727203983.42019: variable 'ansible_connection' from source: unknown 7491 1727203983.42021: variable 'ansible_module_compression' from source: unknown 7491 1727203983.42023: variable 'ansible_shell_type' from source: unknown 7491 1727203983.42025: variable 'ansible_shell_executable' from source: unknown 7491 1727203983.42030: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.42033: variable 'ansible_pipelining' from source: unknown 7491 1727203983.42036: variable 'ansible_timeout' from source: unknown 7491 1727203983.42040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203983.42136: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203983.42148: variable 'omit' from source: magic vars 7491 1727203983.42151: starting attempt loop 7491 1727203983.42154: running the handler 7491 
1727203983.42166: _low_level_execute_command(): starting 7491 1727203983.42172: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203983.42692: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203983.42709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203983.42725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203983.42737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203983.42791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203983.42801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203983.42869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203983.44482: stdout chunk (state=3): >>>/root <<< 7491 1727203983.44570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203983.44634: stderr chunk (state=3): >>><<< 7491 1727203983.44637: stdout chunk (state=3): >>><<< 7491 1727203983.44657: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203983.44669: _low_level_execute_command(): starting 7491 1727203983.44675: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203983.4465814-8794-16378506023233 `" && echo ansible-tmp-1727203983.4465814-8794-16378506023233="` echo /root/.ansible/tmp/ansible-tmp-1727203983.4465814-8794-16378506023233 `" ) && sleep 0' 7491 1727203983.45161: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203983.45178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203983.45197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
debug2: match not found <<< 7491 1727203983.45210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203983.45220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203983.45272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203983.45285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203983.45332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203983.47138: stdout chunk (state=3): >>>ansible-tmp-1727203983.4465814-8794-16378506023233=/root/.ansible/tmp/ansible-tmp-1727203983.4465814-8794-16378506023233 <<< 7491 1727203983.47281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203983.47306: stderr chunk (state=3): >>><<< 7491 1727203983.47310: stdout chunk (state=3): >>><<< 7491 1727203983.47327: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203983.4465814-8794-16378506023233=/root/.ansible/tmp/ansible-tmp-1727203983.4465814-8794-16378506023233 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203983.47369: variable 'ansible_module_compression' from source: unknown 7491 1727203983.47411: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7491 1727203983.47455: variable 'ansible_facts' from source: unknown 7491 1727203983.47547: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203983.4465814-8794-16378506023233/AnsiballZ_network_connections.py 7491 1727203983.47662: Sending initial data 7491 1727203983.47675: Sent initial data (165 bytes) 7491 1727203983.48382: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203983.48387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203983.48441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203983.48449: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203983.48451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203983.48453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203983.48501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203983.48505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203983.48567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203983.50250: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203983.50288: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203983.50331: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpemxitv7m /root/.ansible/tmp/ansible-tmp-1727203983.4465814-8794-16378506023233/AnsiballZ_network_connections.py 
<<< 7491 1727203983.50365: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203983.51493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203983.51606: stderr chunk (state=3): >>><<< 7491 1727203983.51610: stdout chunk (state=3): >>><<< 7491 1727203983.51629: done transferring module to remote 7491 1727203983.51639: _low_level_execute_command(): starting 7491 1727203983.51642: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203983.4465814-8794-16378506023233/ /root/.ansible/tmp/ansible-tmp-1727203983.4465814-8794-16378506023233/AnsiballZ_network_connections.py && sleep 0' 7491 1727203983.52127: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203983.52131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203983.52157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203983.52171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203983.52222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203983.52228: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 7491 1727203983.52236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203983.52294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203983.53975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203983.54042: stderr chunk (state=3): >>><<< 7491 1727203983.54046: stdout chunk (state=3): >>><<< 7491 1727203983.54060: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203983.54063: _low_level_execute_command(): starting 7491 1727203983.54073: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203983.4465814-8794-16378506023233/AnsiballZ_network_connections.py && sleep 0' 7491 1727203983.54558: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203983.54563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203983.54596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203983.54608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203983.54662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203983.54678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203983.54736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203983.82882: stdout chunk (state=3): >>>Traceback (most recent call last):<<< 7491 1727203983.82889: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zibe1dv_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zibe1dv_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 7491 1727203983.82898: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/e01b0787-1873-4334-a8a8-27f8e63061d2: error=unknown <<< 7491 1727203983.83047: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7491 1727203983.84522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203983.84585: stderr chunk (state=3): >>><<< 7491 1727203983.84588: stdout chunk (state=3): >>><<< 7491 1727203983.84606: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zibe1dv_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zibe1dv_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/e01b0787-1873-4334-a8a8-27f8e63061d2: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203983.84640: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203983.4465814-8794-16378506023233/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203983.84648: _low_level_execute_command(): starting 7491 1727203983.84653: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203983.4465814-8794-16378506023233/ > /dev/null 2>&1 && sleep 0' 7491 
1727203983.85128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203983.85133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203983.85172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203983.85185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727203983.85197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203983.85243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203983.85255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203983.85306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203983.87076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203983.87137: stderr chunk (state=3): >>><<< 7491 1727203983.87140: stdout chunk (state=3): >>><<< 7491 1727203983.87155: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203983.87161: handler run complete 7491 1727203983.87184: attempt loop complete, returning result 7491 1727203983.87187: _execute() done 7491 1727203983.87196: dumping result to json 7491 1727203983.87199: done dumping result, returning 7491 1727203983.87203: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-0a4a-ad01-000000000079] 7491 1727203983.87210: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000079 7491 1727203983.87313: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000079 7491 1727203983.87316: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 7491 1727203983.87412: no more pending results, 
returning what we have 7491 1727203983.87415: results queue empty 7491 1727203983.87416: checking for any_errors_fatal 7491 1727203983.87423: done checking for any_errors_fatal 7491 1727203983.87424: checking for max_fail_percentage 7491 1727203983.87425: done checking for max_fail_percentage 7491 1727203983.87426: checking to see if all hosts have failed and the running result is not ok 7491 1727203983.87427: done checking to see if all hosts have failed 7491 1727203983.87428: getting the remaining hosts for this loop 7491 1727203983.87429: done getting the remaining hosts for this loop 7491 1727203983.87433: getting the next task for host managed-node3 7491 1727203983.87439: done getting next task for host managed-node3 7491 1727203983.87443: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7491 1727203983.87445: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203983.87456: getting variables 7491 1727203983.87458: in VariableManager get_vars() 7491 1727203983.87507: Calling all_inventory to load vars for managed-node3 7491 1727203983.87510: Calling groups_inventory to load vars for managed-node3 7491 1727203983.87512: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203983.87521: Calling all_plugins_play to load vars for managed-node3 7491 1727203983.87524: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203983.87526: Calling groups_plugins_play to load vars for managed-node3 7491 1727203983.88351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203983.89273: done with get_vars() 7491 1727203983.89297: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.550) 0:00:25.817 ***** 7491 1727203983.89366: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 7491 1727203983.89607: worker is 1 (out of 1 available) 7491 1727203983.89621: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 7491 1727203983.89634: done queuing things up, now waiting for results queue to drain 7491 1727203983.89635: waiting for pending results... 
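The lines above record one complete remote round-trip for the `network_connections` module: tempdir creation, AnsiballZ transfer over sftp, `chmod`, execution with the remote Python, and `rm -f -r` cleanup. A minimal sketch of that per-task command sequence, using an illustrative tempdir name and a stub payload in place of this run's actual `ansible-tmp-1727203983.…` path and zipped module:

```shell
# Sketch of the per-task remote command sequence recorded in the log above.
# The tempdir name and module stub are illustrative stand-ins, not this
# run's values.
tmpbase="$HOME/.ansible/tmp"
tmpdir="$tmpbase/ansible-tmp-example-$$"

# 1. Create a private per-task temp directory (umask 77 -> mode 700),
#    echoing the path back so the controller learns where it landed.
( umask 77 && mkdir -p "$tmpbase" && mkdir "$tmpdir" && echo "$tmpdir" )

# 2. Ansible copies the AnsiballZ payload in over sftp ("sftp> put ...");
#    a trivial stub stands in for it here.
printf '%s\n' 'print("module ran")' > "$tmpdir/AnsiballZ_example.py"

# 3. Mark the directory and module executable, then run the module with
#    the remote interpreter (python3.9 in this log).
chmod u+x "$tmpdir" "$tmpdir/AnsiballZ_example.py"
/usr/bin/env python3 "$tmpdir/AnsiballZ_example.py"

# 4. Remove the temp directory once the JSON result has been collected.
rm -f -r "$tmpdir" > /dev/null 2>&1
```

Note that in the run above the module's stdout carried a Python traceback (`LsrNetworkNmError: Connection volatilize aborted on veth0/…`) immediately before the result JSON, yet the task still reported `changed: true` with `rc=0` — the controller only parses the final JSON document from stdout.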
7491 1727203983.89831: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 7491 1727203983.89940: in run() - task 0affcd87-79f5-0a4a-ad01-00000000007a 7491 1727203983.89952: variable 'ansible_search_path' from source: unknown 7491 1727203983.89955: variable 'ansible_search_path' from source: unknown 7491 1727203983.89989: calling self._execute() 7491 1727203983.90066: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.90071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203983.90080: variable 'omit' from source: magic vars 7491 1727203983.90369: variable 'ansible_distribution_major_version' from source: facts 7491 1727203983.90380: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203983.90469: variable 'network_state' from source: role '' defaults 7491 1727203983.90477: Evaluated conditional (network_state != {}): False 7491 1727203983.90480: when evaluation is False, skipping this task 7491 1727203983.90483: _execute() done 7491 1727203983.90485: dumping result to json 7491 1727203983.90488: done dumping result, returning 7491 1727203983.90494: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-0a4a-ad01-00000000007a] 7491 1727203983.90501: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000007a 7491 1727203983.90586: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000007a 7491 1727203983.90589: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203983.90673: no more pending results, returning what we have 7491 1727203983.90677: results queue empty 7491 1727203983.90678: checking for any_errors_fatal 7491 1727203983.90688: done checking for any_errors_fatal 7491 1727203983.90688: 
checking for max_fail_percentage 7491 1727203983.90690: done checking for max_fail_percentage 7491 1727203983.90691: checking to see if all hosts have failed and the running result is not ok 7491 1727203983.90692: done checking to see if all hosts have failed 7491 1727203983.90693: getting the remaining hosts for this loop 7491 1727203983.90695: done getting the remaining hosts for this loop 7491 1727203983.90699: getting the next task for host managed-node3 7491 1727203983.90704: done getting next task for host managed-node3 7491 1727203983.90708: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7491 1727203983.90711: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203983.90727: getting variables 7491 1727203983.90729: in VariableManager get_vars() 7491 1727203983.90773: Calling all_inventory to load vars for managed-node3 7491 1727203983.90776: Calling groups_inventory to load vars for managed-node3 7491 1727203983.90778: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203983.90786: Calling all_plugins_play to load vars for managed-node3 7491 1727203983.90788: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203983.90791: Calling groups_plugins_play to load vars for managed-node3 7491 1727203983.91709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203983.92632: done with get_vars() 7491 1727203983.92653: done getting variables 7491 1727203983.92702: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.033) 0:00:25.851 ***** 7491 1727203983.92729: entering _queue_task() for managed-node3/debug 7491 1727203983.92965: worker is 1 (out of 1 available) 7491 1727203983.92980: exiting _queue_task() for managed-node3/debug 7491 1727203983.92993: done queuing things up, now waiting for results queue to drain 7491 1727203983.92994: waiting for pending results... 
7491 1727203983.93190: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7491 1727203983.93288: in run() - task 0affcd87-79f5-0a4a-ad01-00000000007b 7491 1727203983.93301: variable 'ansible_search_path' from source: unknown 7491 1727203983.93304: variable 'ansible_search_path' from source: unknown 7491 1727203983.93337: calling self._execute() 7491 1727203983.93414: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.93418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203983.93428: variable 'omit' from source: magic vars 7491 1727203983.93706: variable 'ansible_distribution_major_version' from source: facts 7491 1727203983.93716: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203983.93724: variable 'omit' from source: magic vars 7491 1727203983.93764: variable 'omit' from source: magic vars 7491 1727203983.93794: variable 'omit' from source: magic vars 7491 1727203983.93831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203983.93858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203983.93878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203983.93892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203983.93904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203983.93930: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203983.93933: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.93935: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed-node3' 7491 1727203983.94008: Set connection var ansible_timeout to 10 7491 1727203983.94019: Set connection var ansible_pipelining to False 7491 1727203983.94022: Set connection var ansible_shell_type to sh 7491 1727203983.94026: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203983.94033: Set connection var ansible_shell_executable to /bin/sh 7491 1727203983.94038: Set connection var ansible_connection to ssh 7491 1727203983.94055: variable 'ansible_shell_executable' from source: unknown 7491 1727203983.94057: variable 'ansible_connection' from source: unknown 7491 1727203983.94060: variable 'ansible_module_compression' from source: unknown 7491 1727203983.94062: variable 'ansible_shell_type' from source: unknown 7491 1727203983.94067: variable 'ansible_shell_executable' from source: unknown 7491 1727203983.94069: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.94071: variable 'ansible_pipelining' from source: unknown 7491 1727203983.94074: variable 'ansible_timeout' from source: unknown 7491 1727203983.94078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203983.94184: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203983.94194: variable 'omit' from source: magic vars 7491 1727203983.94197: starting attempt loop 7491 1727203983.94199: running the handler 7491 1727203983.94297: variable '__network_connections_result' from source: set_fact 7491 1727203983.94339: handler run complete 7491 1727203983.94354: attempt loop complete, returning result 7491 1727203983.94357: _execute() done 7491 1727203983.94359: dumping result to json 7491 1727203983.94362: done dumping result, returning 7491 
1727203983.94371: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-0a4a-ad01-00000000007b] 7491 1727203983.94376: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000007b 7491 1727203983.94468: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000007b 7491 1727203983.94471: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
7491 1727203983.94538: no more pending results, returning what we have 7491 1727203983.94541: results queue empty 7491 1727203983.94543: checking for any_errors_fatal 7491 1727203983.94552: done checking for any_errors_fatal 7491 1727203983.94553: checking for max_fail_percentage 7491 1727203983.94554: done checking for max_fail_percentage 7491 1727203983.94555: checking to see if all hosts have failed and the running result is not ok 7491 1727203983.94556: done checking to see if all hosts have failed 7491 1727203983.94557: getting the remaining hosts for this loop 7491 1727203983.94559: done getting the remaining hosts for this loop 7491 1727203983.94563: getting the next task for host managed-node3 7491 1727203983.94570: done getting next task for host managed-node3 7491 1727203983.94574: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7491 1727203983.94577: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False 7491 1727203983.94587: getting variables 7491 1727203983.94589: in VariableManager get_vars() 7491 1727203983.94634: Calling all_inventory to load vars for managed-node3 7491 1727203983.94641: Calling groups_inventory to load vars for managed-node3 7491 1727203983.94643: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203983.94655: Calling all_plugins_play to load vars for managed-node3 7491 1727203983.94658: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203983.94661: Calling groups_plugins_play to load vars for managed-node3 7491 1727203983.95477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203983.96502: done with get_vars() 7491 1727203983.96519: done getting variables 7491 1727203983.96562: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Tuesday 24 September 2024 14:53:03 -0400 (0:00:00.038) 0:00:25.889 *****
7491 1727203983.96589: entering _queue_task() for managed-node3/debug 7491 1727203983.96820: worker is 1 (out of 1 available) 7491 1727203983.96835: exiting _queue_task() for managed-node3/debug 7491 1727203983.96849: done queuing things up, now waiting for results queue to drain 7491 1727203983.96851: waiting for pending results... 
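The task being queued here prints the `__network_connections_result` fact set earlier in the role. Judging only from what the log shows (the `debug` action module, the variable name from `set_fact`, and the evaluated conditional), the task at main.yml:181 plausibly looks something like the following sketch; this is a reconstruction from the log, not the role's actual source:

```yaml
# Hypothetical reconstruction of the task at
# roles/network/tasks/main.yml:181, inferred from the log output.
- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
  # The log shows this conditional evaluated on the task:
  when: ansible_distribution_major_version != '6'
```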
7491 1727203983.97047: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7491 1727203983.97139: in run() - task 0affcd87-79f5-0a4a-ad01-00000000007c 7491 1727203983.97151: variable 'ansible_search_path' from source: unknown 7491 1727203983.97154: variable 'ansible_search_path' from source: unknown 7491 1727203983.97188: calling self._execute() 7491 1727203983.97274: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.97280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203983.97283: variable 'omit' from source: magic vars 7491 1727203983.97569: variable 'ansible_distribution_major_version' from source: facts 7491 1727203983.97580: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203983.97587: variable 'omit' from source: magic vars 7491 1727203983.97633: variable 'omit' from source: magic vars 7491 1727203983.97657: variable 'omit' from source: magic vars 7491 1727203983.97692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203983.97722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203983.97741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203983.97754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203983.97766: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203983.97790: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203983.97793: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.97795: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727203983.97869: Set connection var ansible_timeout to 10 7491 1727203983.97875: Set connection var ansible_pipelining to False 7491 1727203983.97880: Set connection var ansible_shell_type to sh 7491 1727203983.97885: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203983.97892: Set connection var ansible_shell_executable to /bin/sh 7491 1727203983.97896: Set connection var ansible_connection to ssh 7491 1727203983.97913: variable 'ansible_shell_executable' from source: unknown 7491 1727203983.97916: variable 'ansible_connection' from source: unknown 7491 1727203983.97922: variable 'ansible_module_compression' from source: unknown 7491 1727203983.97929: variable 'ansible_shell_type' from source: unknown 7491 1727203983.97932: variable 'ansible_shell_executable' from source: unknown 7491 1727203983.97934: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203983.97936: variable 'ansible_pipelining' from source: unknown 7491 1727203983.97941: variable 'ansible_timeout' from source: unknown 7491 1727203983.97943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203983.98048: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203983.98058: variable 'omit' from source: magic vars 7491 1727203983.98067: starting attempt loop 7491 1727203983.98070: running the handler 7491 1727203983.98105: variable '__network_connections_result' from source: set_fact 7491 1727203983.98167: variable '__network_connections_result' from source: set_fact 7491 1727203983.98240: handler run complete 7491 1727203983.98266: attempt loop complete, returning result 7491 1727203983.98269: _execute() done 7491 1727203983.98272: dumping 
result to json 7491 1727203983.98274: done dumping result, returning 7491 1727203983.98282: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-0a4a-ad01-00000000007c] 7491 1727203983.98287: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000007c 7491 1727203983.98384: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000007c 7491 1727203983.98387: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "veth0",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
7491 1727203983.98475: no more pending results, returning what we have 7491 1727203983.98479: results queue empty 7491 1727203983.98480: checking for any_errors_fatal 7491 1727203983.98487: done checking for any_errors_fatal 7491 1727203983.98488: checking for max_fail_percentage 7491 1727203983.98490: done checking for max_fail_percentage 7491 1727203983.98490: checking to see if all hosts have failed and the running result is not ok 7491 1727203983.98491: done checking to see if all hosts have failed 7491 1727203983.98492: getting the remaining hosts for this loop 7491 1727203983.98494: done getting the remaining hosts for this loop 7491 1727203983.98498: getting the next task for host managed-node3 7491 1727203983.98503: done getting next task for host managed-node3 7491 1727203983.98506: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7491 1727203983.98509: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203983.98523: getting variables 7491 1727203983.98525: in VariableManager get_vars() 7491 1727203983.98568: Calling all_inventory to load vars for managed-node3 7491 1727203983.98571: Calling groups_inventory to load vars for managed-node3 7491 1727203983.98573: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203983.98585: Calling all_plugins_play to load vars for managed-node3 7491 1727203983.98588: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203983.98590: Calling groups_plugins_play to load vars for managed-node3 7491 1727203983.99425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203984.00349: done with get_vars() 7491 1727203984.00370: done getting variables 7491 1727203984.00414: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 14:53:04 -0400 (0:00:00.038) 0:00:25.928 *****
7491 1727203984.00448: entering _queue_task() for managed-node3/debug 7491 1727203984.00688: worker is 1 (out of 1 available) 7491 1727203984.00703: exiting _queue_task() 
for managed-node3/debug 7491 1727203984.00720: done queuing things up, now waiting for results queue to drain 7491 1727203984.00721: waiting for pending results... 7491 1727203984.00908: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7491 1727203984.00996: in run() - task 0affcd87-79f5-0a4a-ad01-00000000007d 7491 1727203984.01009: variable 'ansible_search_path' from source: unknown 7491 1727203984.01013: variable 'ansible_search_path' from source: unknown 7491 1727203984.01042: calling self._execute() 7491 1727203984.01119: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.01123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203984.01130: variable 'omit' from source: magic vars 7491 1727203984.01407: variable 'ansible_distribution_major_version' from source: facts 7491 1727203984.01421: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203984.01503: variable 'network_state' from source: role '' defaults 7491 1727203984.01514: Evaluated conditional (network_state != {}): False 7491 1727203984.01520: when evaluation is False, skipping this task 7491 1727203984.01523: _execute() done 7491 1727203984.01526: dumping result to json 7491 1727203984.01528: done dumping result, returning 7491 1727203984.01532: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-0a4a-ad01-00000000007d] 7491 1727203984.01538: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000007d 7491 1727203984.01628: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000007d 7491 1727203984.01630: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "false_condition": "network_state != {}"
}
7491 1727203984.01675: no more pending results, returning what we have 7491 1727203984.01679: results queue empty 7491 
1727203984.01680: checking for any_errors_fatal 7491 1727203984.01688: done checking for any_errors_fatal 7491 1727203984.01689: checking for max_fail_percentage 7491 1727203984.01690: done checking for max_fail_percentage 7491 1727203984.01691: checking to see if all hosts have failed and the running result is not ok 7491 1727203984.01692: done checking to see if all hosts have failed 7491 1727203984.01693: getting the remaining hosts for this loop 7491 1727203984.01695: done getting the remaining hosts for this loop 7491 1727203984.01698: getting the next task for host managed-node3 7491 1727203984.01704: done getting next task for host managed-node3 7491 1727203984.01709: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7491 1727203984.01712: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203984.01731: getting variables 7491 1727203984.01733: in VariableManager get_vars() 7491 1727203984.01780: Calling all_inventory to load vars for managed-node3 7491 1727203984.01783: Calling groups_inventory to load vars for managed-node3 7491 1727203984.01784: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.01794: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.01796: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.01799: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.02712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203984.03625: done with get_vars() 7491 1727203984.03648: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 14:53:04 -0400 (0:00:00.032) 0:00:25.961 *****
7491 1727203984.03724: entering _queue_task() for managed-node3/ping 7491 1727203984.03959: worker is 1 (out of 1 available) 7491 1727203984.03974: exiting _queue_task() for managed-node3/ping 7491 1727203984.03989: done queuing things up, now waiting for results queue to drain 7491 1727203984.03990: waiting for pending results... 
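The "Re-test connectivity" task queued here resolves to the `ping` module (the log's `_queue_task() for managed-node3/ping`), which does a full module round-trip over the connection rather than an ICMP ping. A hedged sketch of what the task at main.yml:192 might contain, inferred from the log rather than the role's source:

```yaml
# Hypothetical reconstruction of the task at
# roles/network/tasks/main.yml:192, inferred from the log output.
- name: Re-test connectivity
  ping:
```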
7491 1727203984.04190: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7491 1727203984.04289: in run() - task 0affcd87-79f5-0a4a-ad01-00000000007e 7491 1727203984.04301: variable 'ansible_search_path' from source: unknown 7491 1727203984.04305: variable 'ansible_search_path' from source: unknown 7491 1727203984.04339: calling self._execute() 7491 1727203984.04419: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.04425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203984.04436: variable 'omit' from source: magic vars 7491 1727203984.04720: variable 'ansible_distribution_major_version' from source: facts 7491 1727203984.04734: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203984.04740: variable 'omit' from source: magic vars 7491 1727203984.04780: variable 'omit' from source: magic vars 7491 1727203984.04805: variable 'omit' from source: magic vars 7491 1727203984.04844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203984.04873: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203984.04895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203984.04908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203984.04923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203984.04946: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203984.04949: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.04952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 
1727203984.05028: Set connection var ansible_timeout to 10 7491 1727203984.05034: Set connection var ansible_pipelining to False 7491 1727203984.05039: Set connection var ansible_shell_type to sh 7491 1727203984.05045: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203984.05052: Set connection var ansible_shell_executable to /bin/sh 7491 1727203984.05056: Set connection var ansible_connection to ssh 7491 1727203984.05076: variable 'ansible_shell_executable' from source: unknown 7491 1727203984.05079: variable 'ansible_connection' from source: unknown 7491 1727203984.05082: variable 'ansible_module_compression' from source: unknown 7491 1727203984.05085: variable 'ansible_shell_type' from source: unknown 7491 1727203984.05088: variable 'ansible_shell_executable' from source: unknown 7491 1727203984.05090: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.05092: variable 'ansible_pipelining' from source: unknown 7491 1727203984.05094: variable 'ansible_timeout' from source: unknown 7491 1727203984.05099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203984.05251: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203984.05260: variable 'omit' from source: magic vars 7491 1727203984.05267: starting attempt loop 7491 1727203984.05273: running the handler 7491 1727203984.05289: _low_level_execute_command(): starting 7491 1727203984.05296: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203984.05831: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203984.05846: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.05864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.05879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.05931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203984.05943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.05993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.07582: stdout chunk (state=3): >>>/root <<< 7491 1727203984.07679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203984.07740: stderr chunk (state=3): >>><<< 7491 1727203984.07743: stdout chunk (state=3): >>><<< 7491 1727203984.07765: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203984.07778: _low_level_execute_command(): starting 7491 1727203984.07784: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203984.077662-8806-113501282679845 `" && echo ansible-tmp-1727203984.077662-8806-113501282679845="` echo /root/.ansible/tmp/ansible-tmp-1727203984.077662-8806-113501282679845 `" ) && sleep 0' 7491 1727203984.08254: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203984.08277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.08290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.08302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203984.08320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.08359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203984.08378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.08425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.10234: stdout chunk (state=3): >>>ansible-tmp-1727203984.077662-8806-113501282679845=/root/.ansible/tmp/ansible-tmp-1727203984.077662-8806-113501282679845 <<< 7491 1727203984.10338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203984.10400: stderr chunk (state=3): >>><<< 7491 1727203984.10407: stdout chunk (state=3): >>><<< 7491 1727203984.10429: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203984.077662-8806-113501282679845=/root/.ansible/tmp/ansible-tmp-1727203984.077662-8806-113501282679845 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203984.10471: variable 'ansible_module_compression' from source: unknown 7491 1727203984.10507: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7491 1727203984.10543: variable 'ansible_facts' from source: unknown 7491 1727203984.10597: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203984.077662-8806-113501282679845/AnsiballZ_ping.py 7491 1727203984.10706: Sending initial data 7491 1727203984.10716: Sent initial data (150 bytes) 7491 1727203984.11434: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203984.11438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.11470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203984.11473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203984.11480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.11526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203984.11538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.11589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.13254: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203984.13291: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203984.13333: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpw1r0qrgf /root/.ansible/tmp/ansible-tmp-1727203984.077662-8806-113501282679845/AnsiballZ_ping.py <<< 7491 1727203984.13367: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203984.14148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203984.14266: stderr chunk (state=3): >>><<< 7491 1727203984.14270: stdout chunk (state=3): >>><<< 7491 1727203984.14290: done transferring module to remote 7491 1727203984.14299: _low_level_execute_command(): starting 7491 1727203984.14308: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727203984.077662-8806-113501282679845/ /root/.ansible/tmp/ansible-tmp-1727203984.077662-8806-113501282679845/AnsiballZ_ping.py && sleep 0' 7491 1727203984.14790: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203984.14793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.14836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.14839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203984.14842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.14898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203984.14901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203984.14908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.14949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.16618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203984.16669: stderr chunk (state=3): >>><<< 7491 1727203984.16678: stdout chunk (state=3): >>><<< 7491 1727203984.16695: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203984.16703: _low_level_execute_command(): starting 7491 1727203984.16706: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203984.077662-8806-113501282679845/AnsiballZ_ping.py && sleep 0' 7491 1727203984.17167: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203984.17181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.17199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203984.17222: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.17267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203984.17280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.17338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.30108: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7491 1727203984.31094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203984.31153: stderr chunk (state=3): >>><<< 7491 1727203984.31159: stdout chunk (state=3): >>><<< 7491 1727203984.31177: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
7491 1727203984.31200: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203984.077662-8806-113501282679845/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203984.31208: _low_level_execute_command(): starting 7491 1727203984.31213: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203984.077662-8806-113501282679845/ > /dev/null 2>&1 && sleep 0' 7491 1727203984.31701: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203984.31704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.31738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.31742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.31744: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.31787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203984.31800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.31849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.33601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203984.33662: stderr chunk (state=3): >>><<< 7491 1727203984.33667: stdout chunk (state=3): >>><<< 7491 1727203984.33682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203984.33689: handler run complete 7491 1727203984.33702: attempt loop 
complete, returning result 7491 1727203984.33705: _execute() done 7491 1727203984.33707: dumping result to json 7491 1727203984.33710: done dumping result, returning 7491 1727203984.33720: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-0a4a-ad01-00000000007e] 7491 1727203984.33727: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000007e 7491 1727203984.33819: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000007e 7491 1727203984.33822: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 7491 1727203984.33920: no more pending results, returning what we have 7491 1727203984.33924: results queue empty 7491 1727203984.33925: checking for any_errors_fatal 7491 1727203984.33932: done checking for any_errors_fatal 7491 1727203984.33933: checking for max_fail_percentage 7491 1727203984.33934: done checking for max_fail_percentage 7491 1727203984.33935: checking to see if all hosts have failed and the running result is not ok 7491 1727203984.33936: done checking to see if all hosts have failed 7491 1727203984.33937: getting the remaining hosts for this loop 7491 1727203984.33939: done getting the remaining hosts for this loop 7491 1727203984.33943: getting the next task for host managed-node3 7491 1727203984.33952: done getting next task for host managed-node3 7491 1727203984.33954: ^ task is: TASK: meta (role_complete) 7491 1727203984.33957: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 7491 1727203984.33974: getting variables 7491 1727203984.33976: in VariableManager get_vars() 7491 1727203984.34023: Calling all_inventory to load vars for managed-node3 7491 1727203984.34026: Calling groups_inventory to load vars for managed-node3 7491 1727203984.34028: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.34037: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.34039: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.34042: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.34855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203984.35777: done with get_vars() 7491 1727203984.35798: done getting variables 7491 1727203984.35861: done queuing things up, now waiting for results queue to drain 7491 1727203984.35863: results queue empty 7491 1727203984.35865: checking for any_errors_fatal 7491 1727203984.35867: done checking for any_errors_fatal 7491 1727203984.35868: checking for max_fail_percentage 7491 1727203984.35869: done checking for max_fail_percentage 7491 1727203984.35869: checking to see if all hosts have failed and the running result is not ok 7491 1727203984.35870: done checking to see if all hosts have failed 7491 1727203984.35870: getting the remaining hosts for this loop 7491 1727203984.35871: done getting the remaining hosts for this loop 7491 1727203984.35873: getting the next task for host managed-node3 7491 1727203984.35876: done getting next task for host managed-node3 7491 1727203984.35878: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7491 1727203984.35879: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203984.35881: getting variables 7491 1727203984.35881: in VariableManager get_vars() 7491 1727203984.35895: Calling all_inventory to load vars for managed-node3 7491 1727203984.35896: Calling groups_inventory to load vars for managed-node3 7491 1727203984.35898: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.35902: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.35904: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.35905: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.40580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203984.41480: done with get_vars() 7491 1727203984.41501: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:79 Tuesday 24 September 2024 14:53:04 -0400 (0:00:00.378) 0:00:26.339 ***** 7491 1727203984.41558: entering _queue_task() for managed-node3/include_tasks 7491 1727203984.41822: worker is 1 (out of 1 available) 7491 1727203984.41836: exiting _queue_task() for managed-node3/include_tasks 7491 1727203984.41849: done queuing things up, now waiting for results queue to drain 7491 1727203984.41851: waiting for pending results... 
7491 1727203984.42048: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 7491 1727203984.42127: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000ae 7491 1727203984.42138: variable 'ansible_search_path' from source: unknown 7491 1727203984.42172: calling self._execute() 7491 1727203984.42253: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.42258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203984.42267: variable 'omit' from source: magic vars 7491 1727203984.42558: variable 'ansible_distribution_major_version' from source: facts 7491 1727203984.42569: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203984.42576: _execute() done 7491 1727203984.42580: dumping result to json 7491 1727203984.42584: done dumping result, returning 7491 1727203984.42589: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-0a4a-ad01-0000000000ae] 7491 1727203984.42595: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000ae 7491 1727203984.42697: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000ae 7491 1727203984.42700: WORKER PROCESS EXITING 7491 1727203984.42730: no more pending results, returning what we have 7491 1727203984.42736: in VariableManager get_vars() 7491 1727203984.42794: Calling all_inventory to load vars for managed-node3 7491 1727203984.42798: Calling groups_inventory to load vars for managed-node3 7491 1727203984.42800: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.42819: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.42823: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.42826: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.43637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7491 1727203984.44562: done with get_vars() 7491 1727203984.44580: variable 'ansible_search_path' from source: unknown 7491 1727203984.44593: we have included files to process 7491 1727203984.44594: generating all_blocks data 7491 1727203984.44596: done generating all_blocks data 7491 1727203984.44600: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727203984.44601: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727203984.44602: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727203984.44870: in VariableManager get_vars() 7491 1727203984.44893: done with get_vars() 7491 1727203984.45326: done processing included file 7491 1727203984.45328: iterating over new_blocks loaded from include file 7491 1727203984.45329: in VariableManager get_vars() 7491 1727203984.45346: done with get_vars() 7491 1727203984.45347: filtering new block on tags 7491 1727203984.45370: done filtering new block on tags 7491 1727203984.45372: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 7491 1727203984.45376: extending task lists for all hosts with included blocks 7491 1727203984.48162: done extending task lists 7491 1727203984.48165: done processing included files 7491 1727203984.48166: results queue empty 7491 1727203984.48166: checking for any_errors_fatal 7491 1727203984.48167: done checking for any_errors_fatal 7491 1727203984.48168: checking for max_fail_percentage 7491 1727203984.48169: done checking for max_fail_percentage 7491 1727203984.48169: checking to see if all hosts have failed and the running 
result is not ok 7491 1727203984.48170: done checking to see if all hosts have failed 7491 1727203984.48170: getting the remaining hosts for this loop 7491 1727203984.48172: done getting the remaining hosts for this loop 7491 1727203984.48173: getting the next task for host managed-node3 7491 1727203984.48176: done getting next task for host managed-node3 7491 1727203984.48178: ^ task is: TASK: Ensure state in ["present", "absent"] 7491 1727203984.48179: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203984.48181: getting variables 7491 1727203984.48182: in VariableManager get_vars() 7491 1727203984.48198: Calling all_inventory to load vars for managed-node3 7491 1727203984.48200: Calling groups_inventory to load vars for managed-node3 7491 1727203984.48201: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.48206: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.48207: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.48209: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.48903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203984.49829: done with get_vars() 7491 1727203984.49850: done getting variables 7491 1727203984.49888: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:53:04 -0400 (0:00:00.083) 0:00:26.423 ***** 7491 1727203984.49910: entering _queue_task() for managed-node3/fail 7491 1727203984.50159: worker is 1 (out of 1 available) 7491 1727203984.50175: exiting _queue_task() for managed-node3/fail 7491 1727203984.50188: done queuing things up, now waiting for results queue to drain 7491 1727203984.50189: waiting for pending results... 
7491 1727203984.50375: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 7491 1727203984.50444: in run() - task 0affcd87-79f5-0a4a-ad01-000000000dff 7491 1727203984.50456: variable 'ansible_search_path' from source: unknown 7491 1727203984.50459: variable 'ansible_search_path' from source: unknown 7491 1727203984.50491: calling self._execute() 7491 1727203984.50576: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.50580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203984.50588: variable 'omit' from source: magic vars 7491 1727203984.50880: variable 'ansible_distribution_major_version' from source: facts 7491 1727203984.50890: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203984.50990: variable 'state' from source: include params 7491 1727203984.50994: Evaluated conditional (state not in ["present", "absent"]): False 7491 1727203984.50996: when evaluation is False, skipping this task 7491 1727203984.50999: _execute() done 7491 1727203984.51002: dumping result to json 7491 1727203984.51007: done dumping result, returning 7491 1727203984.51012: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-0a4a-ad01-000000000dff] 7491 1727203984.51021: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000dff 7491 1727203984.51113: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000dff 7491 1727203984.51116: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7491 1727203984.51162: no more pending results, returning what we have 7491 1727203984.51168: results queue empty 7491 1727203984.51169: checking for any_errors_fatal 7491 1727203984.51171: done checking for any_errors_fatal 7491 1727203984.51171: checking for 
max_fail_percentage 7491 1727203984.51173: done checking for max_fail_percentage 7491 1727203984.51174: checking to see if all hosts have failed and the running result is not ok 7491 1727203984.51180: done checking to see if all hosts have failed 7491 1727203984.51181: getting the remaining hosts for this loop 7491 1727203984.51183: done getting the remaining hosts for this loop 7491 1727203984.51187: getting the next task for host managed-node3 7491 1727203984.51193: done getting next task for host managed-node3 7491 1727203984.51196: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7491 1727203984.51199: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203984.51203: getting variables 7491 1727203984.51204: in VariableManager get_vars() 7491 1727203984.51254: Calling all_inventory to load vars for managed-node3 7491 1727203984.51257: Calling groups_inventory to load vars for managed-node3 7491 1727203984.51259: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.51271: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.51274: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.51276: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.52190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203984.53131: done with get_vars() 7491 1727203984.53151: done getting variables 7491 1727203984.53198: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:53:04 -0400 (0:00:00.033) 0:00:26.456 ***** 7491 1727203984.53223: entering _queue_task() for managed-node3/fail 7491 1727203984.53465: worker is 1 (out of 1 available) 7491 1727203984.53480: exiting _queue_task() for managed-node3/fail 7491 1727203984.53492: done queuing things up, now waiting for results queue to drain 7491 1727203984.53494: waiting for pending results... 
7491 1727203984.53683: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7491 1727203984.53746: in run() - task 0affcd87-79f5-0a4a-ad01-000000000e00 7491 1727203984.53759: variable 'ansible_search_path' from source: unknown 7491 1727203984.53765: variable 'ansible_search_path' from source: unknown 7491 1727203984.53795: calling self._execute() 7491 1727203984.53879: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.53883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203984.53892: variable 'omit' from source: magic vars 7491 1727203984.54188: variable 'ansible_distribution_major_version' from source: facts 7491 1727203984.54198: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203984.54303: variable 'type' from source: play vars 7491 1727203984.54307: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7491 1727203984.54310: when evaluation is False, skipping this task 7491 1727203984.54313: _execute() done 7491 1727203984.54315: dumping result to json 7491 1727203984.54321: done dumping result, returning 7491 1727203984.54326: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-0a4a-ad01-000000000e00] 7491 1727203984.54332: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e00 7491 1727203984.54426: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e00 7491 1727203984.54429: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7491 1727203984.54482: no more pending results, returning what we have 7491 1727203984.54486: results queue empty 7491 1727203984.54487: checking for any_errors_fatal 7491 1727203984.54494: done checking for any_errors_fatal 7491 1727203984.54495: checking for 
max_fail_percentage 7491 1727203984.54496: done checking for max_fail_percentage 7491 1727203984.54497: checking to see if all hosts have failed and the running result is not ok 7491 1727203984.54498: done checking to see if all hosts have failed 7491 1727203984.54499: getting the remaining hosts for this loop 7491 1727203984.54501: done getting the remaining hosts for this loop 7491 1727203984.54505: getting the next task for host managed-node3 7491 1727203984.54511: done getting next task for host managed-node3 7491 1727203984.54514: ^ task is: TASK: Include the task 'show_interfaces.yml' 7491 1727203984.54519: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203984.54524: getting variables 7491 1727203984.54525: in VariableManager get_vars() 7491 1727203984.54581: Calling all_inventory to load vars for managed-node3 7491 1727203984.54584: Calling groups_inventory to load vars for managed-node3 7491 1727203984.54586: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.54596: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.54598: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.54601: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.55421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203984.56376: done with get_vars() 7491 1727203984.56401: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:53:04 -0400 (0:00:00.032) 0:00:26.488 ***** 7491 1727203984.56478: entering _queue_task() for managed-node3/include_tasks 7491 1727203984.56727: worker is 1 (out of 1 available) 7491 1727203984.56741: exiting _queue_task() for managed-node3/include_tasks 7491 1727203984.56754: done queuing things up, now waiting for results queue to drain 7491 1727203984.56756: waiting for pending results... 
7491 1727203984.56942: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 7491 1727203984.57019: in run() - task 0affcd87-79f5-0a4a-ad01-000000000e01 7491 1727203984.57030: variable 'ansible_search_path' from source: unknown 7491 1727203984.57033: variable 'ansible_search_path' from source: unknown 7491 1727203984.57070: calling self._execute() 7491 1727203984.57146: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.57150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203984.57159: variable 'omit' from source: magic vars 7491 1727203984.57455: variable 'ansible_distribution_major_version' from source: facts 7491 1727203984.57468: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203984.57473: _execute() done 7491 1727203984.57476: dumping result to json 7491 1727203984.57479: done dumping result, returning 7491 1727203984.57485: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-0a4a-ad01-000000000e01] 7491 1727203984.57491: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e01 7491 1727203984.57585: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e01 7491 1727203984.57588: WORKER PROCESS EXITING 7491 1727203984.57624: no more pending results, returning what we have 7491 1727203984.57629: in VariableManager get_vars() 7491 1727203984.57688: Calling all_inventory to load vars for managed-node3 7491 1727203984.57691: Calling groups_inventory to load vars for managed-node3 7491 1727203984.57694: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.57707: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.57714: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.57723: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.58663: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203984.59593: done with get_vars() 7491 1727203984.59612: variable 'ansible_search_path' from source: unknown 7491 1727203984.59613: variable 'ansible_search_path' from source: unknown 7491 1727203984.59644: we have included files to process 7491 1727203984.59645: generating all_blocks data 7491 1727203984.59646: done generating all_blocks data 7491 1727203984.59650: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203984.59651: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203984.59652: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203984.59736: in VariableManager get_vars() 7491 1727203984.59756: done with get_vars() 7491 1727203984.59840: done processing included file 7491 1727203984.59841: iterating over new_blocks loaded from include file 7491 1727203984.59842: in VariableManager get_vars() 7491 1727203984.59859: done with get_vars() 7491 1727203984.59860: filtering new block on tags 7491 1727203984.59873: done filtering new block on tags 7491 1727203984.59875: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 7491 1727203984.59880: extending task lists for all hosts with included blocks 7491 1727203984.60120: done extending task lists 7491 1727203984.60122: done processing included files 7491 1727203984.60122: results queue empty 7491 1727203984.60123: checking for any_errors_fatal 7491 1727203984.60125: done checking for any_errors_fatal 7491 1727203984.60125: checking for max_fail_percentage 7491 
1727203984.60126: done checking for max_fail_percentage 7491 1727203984.60127: checking to see if all hosts have failed and the running result is not ok 7491 1727203984.60127: done checking to see if all hosts have failed 7491 1727203984.60128: getting the remaining hosts for this loop 7491 1727203984.60129: done getting the remaining hosts for this loop 7491 1727203984.60130: getting the next task for host managed-node3 7491 1727203984.60133: done getting next task for host managed-node3 7491 1727203984.60134: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7491 1727203984.60136: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203984.60138: getting variables 7491 1727203984.60139: in VariableManager get_vars() 7491 1727203984.60151: Calling all_inventory to load vars for managed-node3 7491 1727203984.60153: Calling groups_inventory to load vars for managed-node3 7491 1727203984.60154: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.60158: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.60160: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.60161: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.60911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203984.61822: done with get_vars() 7491 1727203984.61842: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:53:04 -0400 (0:00:00.054) 0:00:26.542 ***** 7491 1727203984.61901: entering _queue_task() for managed-node3/include_tasks 7491 1727203984.62148: worker is 1 (out of 1 available) 7491 1727203984.62161: exiting _queue_task() for managed-node3/include_tasks 7491 1727203984.62176: done queuing things up, now waiting for results queue to drain 7491 1727203984.62177: waiting for pending results... 
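Each include task above logs `Evaluated conditional (ansible_distribution_major_version != '6'): True` before it is allowed to run. A minimal sketch of what that check amounts to — illustrative only; Ansible actually renders the expression through its Jinja2 templating layer, and the fact value is a string such as `'9'`, not a number:

```python
def evaluate_distro_conditional(facts):
    # ansible_distribution_major_version is a string fact, so this is a
    # string comparison, not a numeric one.
    return facts["ansible_distribution_major_version"] != "6"

# An EL9-style host passes the conditional; an EL6 host would be skipped.
print(evaluate_distro_conditional({"ansible_distribution_major_version": "9"}))  # True
```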
7491 1727203984.62369: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 7491 1727203984.62445: in run() - task 0affcd87-79f5-0a4a-ad01-000000001030 7491 1727203984.62456: variable 'ansible_search_path' from source: unknown 7491 1727203984.62459: variable 'ansible_search_path' from source: unknown 7491 1727203984.62492: calling self._execute() 7491 1727203984.62571: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.62574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203984.62585: variable 'omit' from source: magic vars 7491 1727203984.62875: variable 'ansible_distribution_major_version' from source: facts 7491 1727203984.62885: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203984.62891: _execute() done 7491 1727203984.62894: dumping result to json 7491 1727203984.62897: done dumping result, returning 7491 1727203984.62905: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-0a4a-ad01-000000001030] 7491 1727203984.62910: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001030 7491 1727203984.63005: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001030 7491 1727203984.63007: WORKER PROCESS EXITING 7491 1727203984.63043: no more pending results, returning what we have 7491 1727203984.63048: in VariableManager get_vars() 7491 1727203984.63107: Calling all_inventory to load vars for managed-node3 7491 1727203984.63110: Calling groups_inventory to load vars for managed-node3 7491 1727203984.63113: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.63128: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.63136: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.63139: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.63966: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203984.64901: done with get_vars() 7491 1727203984.64915: variable 'ansible_search_path' from source: unknown 7491 1727203984.64918: variable 'ansible_search_path' from source: unknown 7491 1727203984.64958: we have included files to process 7491 1727203984.64959: generating all_blocks data 7491 1727203984.64960: done generating all_blocks data 7491 1727203984.64961: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203984.64962: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203984.64963: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203984.65153: done processing included file 7491 1727203984.65154: iterating over new_blocks loaded from include file 7491 1727203984.65155: in VariableManager get_vars() 7491 1727203984.65174: done with get_vars() 7491 1727203984.65175: filtering new block on tags 7491 1727203984.65188: done filtering new block on tags 7491 1727203984.65190: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 7491 1727203984.65194: extending task lists for all hosts with included blocks 7491 1727203984.65288: done extending task lists 7491 1727203984.65289: done processing included files 7491 1727203984.65290: results queue empty 7491 1727203984.65290: checking for any_errors_fatal 7491 1727203984.65294: done checking for any_errors_fatal 7491 1727203984.65295: checking for max_fail_percentage 7491 1727203984.65295: done checking for max_fail_percentage 7491 
1727203984.65296: checking to see if all hosts have failed and the running result is not ok 7491 1727203984.65296: done checking to see if all hosts have failed 7491 1727203984.65297: getting the remaining hosts for this loop 7491 1727203984.65298: done getting the remaining hosts for this loop 7491 1727203984.65299: getting the next task for host managed-node3 7491 1727203984.65302: done getting next task for host managed-node3 7491 1727203984.65304: ^ task is: TASK: Gather current interface info 7491 1727203984.65307: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203984.65309: getting variables 7491 1727203984.65309: in VariableManager get_vars() 7491 1727203984.65323: Calling all_inventory to load vars for managed-node3 7491 1727203984.65324: Calling groups_inventory to load vars for managed-node3 7491 1727203984.65326: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.65330: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.65331: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.65333: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.66070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203984.66978: done with get_vars() 7491 1727203984.66994: done getting variables 7491 1727203984.67026: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:53:04 -0400 (0:00:00.051) 0:00:26.594 ***** 7491 1727203984.67050: entering _queue_task() for managed-node3/command 7491 1727203984.67290: worker is 1 (out of 1 available) 7491 1727203984.67303: exiting _queue_task() for managed-node3/command 7491 1727203984.67321: done queuing things up, now waiting for results queue to drain 7491 1727203984.67322: waiting for pending results... 
7491 1727203984.67504: running TaskExecutor() for managed-node3/TASK: Gather current interface info 7491 1727203984.67584: in run() - task 0affcd87-79f5-0a4a-ad01-000000001067 7491 1727203984.67592: variable 'ansible_search_path' from source: unknown 7491 1727203984.67596: variable 'ansible_search_path' from source: unknown 7491 1727203984.67625: calling self._execute() 7491 1727203984.67706: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.67710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203984.67721: variable 'omit' from source: magic vars 7491 1727203984.68015: variable 'ansible_distribution_major_version' from source: facts 7491 1727203984.68028: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203984.68033: variable 'omit' from source: magic vars 7491 1727203984.68071: variable 'omit' from source: magic vars 7491 1727203984.68095: variable 'omit' from source: magic vars 7491 1727203984.68135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203984.68162: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203984.68181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203984.68194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203984.68206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203984.68234: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203984.68237: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.68239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203984.68310: Set connection 
var ansible_timeout to 10 7491 1727203984.68316: Set connection var ansible_pipelining to False 7491 1727203984.68323: Set connection var ansible_shell_type to sh 7491 1727203984.68329: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203984.68339: Set connection var ansible_shell_executable to /bin/sh 7491 1727203984.68343: Set connection var ansible_connection to ssh 7491 1727203984.68366: variable 'ansible_shell_executable' from source: unknown 7491 1727203984.68370: variable 'ansible_connection' from source: unknown 7491 1727203984.68373: variable 'ansible_module_compression' from source: unknown 7491 1727203984.68375: variable 'ansible_shell_type' from source: unknown 7491 1727203984.68378: variable 'ansible_shell_executable' from source: unknown 7491 1727203984.68380: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203984.68382: variable 'ansible_pipelining' from source: unknown 7491 1727203984.68384: variable 'ansible_timeout' from source: unknown 7491 1727203984.68389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203984.68496: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203984.68504: variable 'omit' from source: magic vars 7491 1727203984.68509: starting attempt loop 7491 1727203984.68512: running the handler 7491 1727203984.68527: _low_level_execute_command(): starting 7491 1727203984.68535: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203984.69071: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203984.69090: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.69112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7491 1727203984.69129: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.69172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203984.69185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203984.69195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.69252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.70859: stdout chunk (state=3): >>>/root <<< 7491 1727203984.70959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203984.71014: stderr chunk (state=3): >>><<< 7491 1727203984.71017: stdout chunk (state=3): >>><<< 7491 1727203984.71042: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203984.71056: _low_level_execute_command(): starting 7491 1727203984.71062: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203984.7104545-8826-34881409399890 `" && echo ansible-tmp-1727203984.7104545-8826-34881409399890="` echo /root/.ansible/tmp/ansible-tmp-1727203984.7104545-8826-34881409399890 `" ) && sleep 0' 7491 1727203984.71520: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203984.71533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.71555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203984.71575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration 
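The `( umask 77 && mkdir -p ... )` command above creates the per-task remote work directory with a name like `ansible-tmp-1727203984.7104545-8826-34881409399890`. A rough sketch of that naming pattern — which identifiers Ansible actually uses for the two trailing numeric fields (a worker pid and a random discriminator are assumed here) is not confirmed by the log:

```python
import os
import random
import time

def ansible_tmp_dirname(now=None, pid=None, suffix=None):
    """Build a name matching the ansible-tmp-<timestamp>-<n>-<n> pattern
    seen in the log. The pid/random interpretation of the last two
    fields is an assumption, not Ansible's documented behavior."""
    now = time.time() if now is None else now
    pid = os.getpid() if pid is None else pid
    suffix = random.randint(0, 2 ** 48) if suffix is None else suffix
    return "ansible-tmp-%s-%s-%s" % (now, pid, suffix)
```

The remote command wraps the `mkdir` in `umask 77` so the directory comes out mode 0700 even when the parent `~/.ansible/tmp` already exists.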
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203984.71599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.71628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203984.71640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.71697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.73478: stdout chunk (state=3): >>>ansible-tmp-1727203984.7104545-8826-34881409399890=/root/.ansible/tmp/ansible-tmp-1727203984.7104545-8826-34881409399890 <<< 7491 1727203984.73586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203984.73644: stderr chunk (state=3): >>><<< 7491 1727203984.73648: stdout chunk (state=3): >>><<< 7491 1727203984.73666: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203984.7104545-8826-34881409399890=/root/.ansible/tmp/ansible-tmp-1727203984.7104545-8826-34881409399890 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203984.73692: variable 'ansible_module_compression' from source: unknown 7491 1727203984.73744: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203984.73776: variable 'ansible_facts' from source: unknown 7491 1727203984.73841: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203984.7104545-8826-34881409399890/AnsiballZ_command.py 7491 1727203984.73953: Sending initial data 7491 1727203984.73962: Sent initial data (153 bytes) 7491 1727203984.74644: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203984.74648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.74684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.74687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203984.74690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.74747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203984.74750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.74796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.76448: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203984.76485: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203984.76530: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmphgz83zo5 /root/.ansible/tmp/ansible-tmp-1727203984.7104545-8826-34881409399890/AnsiballZ_command.py <<< 7491 1727203984.76568: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203984.77355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203984.77469: stderr chunk (state=3): >>><<< 7491 1727203984.77472: stdout chunk (state=3): >>><<< 7491 1727203984.77491: done transferring module to remote 7491 1727203984.77501: _low_level_execute_command(): starting 7491 1727203984.77507: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203984.7104545-8826-34881409399890/ /root/.ansible/tmp/ansible-tmp-1727203984.7104545-8826-34881409399890/AnsiballZ_command.py && sleep 0' 7491 1727203984.77962: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203984.77977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.77996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203984.78008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203984.78018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.78067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203984.78080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.78127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.79785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203984.79833: stderr chunk (state=3): >>><<< 7491 1727203984.79836: stdout chunk (state=3): >>><<< 7491 1727203984.79853: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203984.79863: _low_level_execute_command(): starting 7491 1727203984.79872: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203984.7104545-8826-34881409399890/AnsiballZ_command.py && sleep 0' 7491 1727203984.80323: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203984.80336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.80358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203984.80372: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203984.80381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.80425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203984.80437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.80496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.93824: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:53:04.934387", "end": "2024-09-24 14:53:04.937415", "delta": "0:00:00.003028", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203984.94899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203984.94961: stderr chunk (state=3): >>><<< 7491 1727203984.94967: stdout chunk (state=3): >>><<< 7491 1727203984.94983: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:53:04.934387", "end": "2024-09-24 14:53:04.937415", "delta": "0:00:00.003028", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
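The wrapped AnsiballZ module prints a single JSON object on stdout, which the action plugin parses back into the task result. Parsing the payload shown above (key names copied verbatim from the log; the `invocation` block is abridged away here for brevity):

```python
import json

# The module's stdout as captured in the log, minus the invocation block.
raw = ('{"changed": true, "stdout": "eth0\\nlo\\npeerveth0\\nveth0", '
       '"stderr": "", "rc": 0, "cmd": ["ls", "-1"], '
       '"start": "2024-09-24 14:53:04.934387", '
       '"end": "2024-09-24 14:53:04.937415", '
       '"delta": "0:00:00.003028", "msg": ""}')

result = json.loads(raw)
# Ansible derives stdout_lines for callers by splitting stdout on newlines.
interfaces = result["stdout"].splitlines()
print(interfaces)  # ['eth0', 'lo', 'peerveth0', 'veth0']
```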
7491 1727203984.95014: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203984.7104545-8826-34881409399890/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203984.95027: _low_level_execute_command(): starting 7491 1727203984.95030: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203984.7104545-8826-34881409399890/ > /dev/null 2>&1 && sleep 0' 7491 1727203984.95509: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203984.95522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203984.95544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203984.95555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203984.95567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203984.95641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203984.95697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203984.97450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203984.97505: stderr chunk (state=3): >>><<< 7491 1727203984.97508: stdout chunk (state=3): >>><<< 7491 1727203984.97531: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203984.97541: handler run complete 7491 
1727203984.97565: Evaluated conditional (False): False 7491 1727203984.97576: attempt loop complete, returning result 7491 1727203984.97579: _execute() done 7491 1727203984.97581: dumping result to json 7491 1727203984.97586: done dumping result, returning 7491 1727203984.97593: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [0affcd87-79f5-0a4a-ad01-000000001067] 7491 1727203984.97600: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001067 7491 1727203984.97700: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001067 7491 1727203984.97703: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003028", "end": "2024-09-24 14:53:04.937415", "rc": 0, "start": "2024-09-24 14:53:04.934387" } STDOUT: eth0 lo peerveth0 veth0 7491 1727203984.97891: no more pending results, returning what we have 7491 1727203984.97895: results queue empty 7491 1727203984.97896: checking for any_errors_fatal 7491 1727203984.97899: done checking for any_errors_fatal 7491 1727203984.97899: checking for max_fail_percentage 7491 1727203984.97901: done checking for max_fail_percentage 7491 1727203984.97902: checking to see if all hosts have failed and the running result is not ok 7491 1727203984.97903: done checking to see if all hosts have failed 7491 1727203984.97904: getting the remaining hosts for this loop 7491 1727203984.97906: done getting the remaining hosts for this loop 7491 1727203984.97910: getting the next task for host managed-node3 7491 1727203984.97920: done getting next task for host managed-node3 7491 1727203984.97922: ^ task is: TASK: Set current_interfaces 7491 1727203984.97928: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203984.97933: getting variables 7491 1727203984.97935: in VariableManager get_vars() 7491 1727203984.97992: Calling all_inventory to load vars for managed-node3 7491 1727203984.97997: Calling groups_inventory to load vars for managed-node3 7491 1727203984.97999: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203984.98011: Calling all_plugins_play to load vars for managed-node3 7491 1727203984.98014: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203984.98020: Calling groups_plugins_play to load vars for managed-node3 7491 1727203984.99391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203985.00316: done with get_vars() 7491 1727203985.00337: done getting variables 7491 1727203985.00384: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:53:05 -0400 (0:00:00.333) 0:00:26.928 ***** 7491 1727203985.00410: entering _queue_task() for managed-node3/set_fact 7491 1727203985.00643: worker is 1 (out of 1 available) 7491 1727203985.00655: exiting _queue_task() for managed-node3/set_fact 7491 1727203985.00670: done queuing things up, now waiting for results queue to drain 7491 1727203985.00672: waiting for pending results... 7491 1727203985.00862: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 7491 1727203985.00945: in run() - task 0affcd87-79f5-0a4a-ad01-000000001068 7491 1727203985.00957: variable 'ansible_search_path' from source: unknown 7491 1727203985.00961: variable 'ansible_search_path' from source: unknown 7491 1727203985.00992: calling self._execute() 7491 1727203985.01113: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203985.01125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203985.01139: variable 'omit' from source: magic vars 7491 1727203985.01688: variable 'ansible_distribution_major_version' from source: facts 7491 1727203985.01707: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203985.01721: variable 'omit' from source: magic vars 7491 1727203985.01784: variable 'omit' from source: magic vars 7491 1727203985.01894: variable '_current_interfaces' from source: set_fact 7491 1727203985.01969: variable 'omit' from source: magic vars 7491 1727203985.02019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203985.02056: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203985.02089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203985.02110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203985.02128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203985.02161: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203985.02171: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203985.02179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203985.02279: Set connection var ansible_timeout to 10 7491 1727203985.02290: Set connection var ansible_pipelining to False 7491 1727203985.02302: Set connection var ansible_shell_type to sh 7491 1727203985.02310: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203985.02322: Set connection var ansible_shell_executable to /bin/sh 7491 1727203985.02328: Set connection var ansible_connection to ssh 7491 1727203985.02355: variable 'ansible_shell_executable' from source: unknown 7491 1727203985.02362: variable 'ansible_connection' from source: unknown 7491 1727203985.02372: variable 'ansible_module_compression' from source: unknown 7491 1727203985.02378: variable 'ansible_shell_type' from source: unknown 7491 1727203985.02383: variable 'ansible_shell_executable' from source: unknown 7491 1727203985.02389: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203985.02399: variable 'ansible_pipelining' from source: unknown 7491 1727203985.02410: variable 'ansible_timeout' from source: unknown 7491 1727203985.02420: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727203985.02559: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203985.02576: variable 'omit' from source: magic vars 7491 1727203985.02585: starting attempt loop 7491 1727203985.02591: running the handler 7491 1727203985.02605: handler run complete 7491 1727203985.02625: attempt loop complete, returning result 7491 1727203985.02631: _execute() done 7491 1727203985.02637: dumping result to json 7491 1727203985.02644: done dumping result, returning 7491 1727203985.02653: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [0affcd87-79f5-0a4a-ad01-000000001068] 7491 1727203985.02663: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001068 ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo", "peerveth0", "veth0" ] }, "changed": false } 7491 1727203985.02823: no more pending results, returning what we have 7491 1727203985.02827: results queue empty 7491 1727203985.02828: checking for any_errors_fatal 7491 1727203985.02838: done checking for any_errors_fatal 7491 1727203985.02839: checking for max_fail_percentage 7491 1727203985.02841: done checking for max_fail_percentage 7491 1727203985.02842: checking to see if all hosts have failed and the running result is not ok 7491 1727203985.02843: done checking to see if all hosts have failed 7491 1727203985.02844: getting the remaining hosts for this loop 7491 1727203985.02846: done getting the remaining hosts for this loop 7491 1727203985.02850: getting the next task for host managed-node3 7491 1727203985.02860: done getting next task for host managed-node3 7491 1727203985.02863: ^ task is: TASK: Show current_interfaces 7491 1727203985.02868: ^ state is: HOST STATE: block=2, 
task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203985.02873: getting variables 7491 1727203985.02875: in VariableManager get_vars() 7491 1727203985.02932: Calling all_inventory to load vars for managed-node3 7491 1727203985.02936: Calling groups_inventory to load vars for managed-node3 7491 1727203985.02938: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203985.02950: Calling all_plugins_play to load vars for managed-node3 7491 1727203985.02953: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203985.02956: Calling groups_plugins_play to load vars for managed-node3 7491 1727203985.04486: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001068 7491 1727203985.04490: WORKER PROCESS EXITING 7491 1727203985.04900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203985.06562: done with get_vars() 7491 1727203985.06594: done getting variables 7491 1727203985.06660: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:53:05 -0400 (0:00:00.062) 0:00:26.990 ***** 7491 1727203985.06696: entering _queue_task() for managed-node3/debug 7491 1727203985.07025: worker is 1 (out of 1 available) 7491 1727203985.07040: exiting _queue_task() for managed-node3/debug 7491 1727203985.07054: done queuing things up, now waiting for results queue to drain 7491 1727203985.07055: waiting for pending results... 7491 1727203985.07352: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 7491 1727203985.07478: in run() - task 0affcd87-79f5-0a4a-ad01-000000001031 7491 1727203985.07507: variable 'ansible_search_path' from source: unknown 7491 1727203985.07519: variable 'ansible_search_path' from source: unknown 7491 1727203985.07562: calling self._execute() 7491 1727203985.07675: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203985.07688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203985.07703: variable 'omit' from source: magic vars 7491 1727203985.08122: variable 'ansible_distribution_major_version' from source: facts 7491 1727203985.08141: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203985.08158: variable 'omit' from source: magic vars 7491 1727203985.08207: variable 'omit' from source: magic vars 7491 1727203985.08312: variable 'current_interfaces' from source: set_fact 7491 1727203985.08346: variable 'omit' from source: magic vars 7491 1727203985.08396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203985.08435: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203985.08457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203985.08481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203985.08495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203985.08529: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203985.08537: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203985.08543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203985.08649: Set connection var ansible_timeout to 10 7491 1727203985.08660: Set connection var ansible_pipelining to False 7491 1727203985.08672: Set connection var ansible_shell_type to sh 7491 1727203985.08681: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203985.08695: Set connection var ansible_shell_executable to /bin/sh 7491 1727203985.08703: Set connection var ansible_connection to ssh 7491 1727203985.08733: variable 'ansible_shell_executable' from source: unknown 7491 1727203985.08740: variable 'ansible_connection' from source: unknown 7491 1727203985.08748: variable 'ansible_module_compression' from source: unknown 7491 1727203985.08753: variable 'ansible_shell_type' from source: unknown 7491 1727203985.08759: variable 'ansible_shell_executable' from source: unknown 7491 1727203985.08766: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203985.08773: variable 'ansible_pipelining' from source: unknown 7491 1727203985.08779: variable 'ansible_timeout' from source: unknown 7491 1727203985.08786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 
1727203985.08928: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203985.08942: variable 'omit' from source: magic vars 7491 1727203985.08951: starting attempt loop 7491 1727203985.08957: running the handler 7491 1727203985.09003: handler run complete 7491 1727203985.09029: attempt loop complete, returning result 7491 1727203985.09037: _execute() done 7491 1727203985.09043: dumping result to json 7491 1727203985.09050: done dumping result, returning 7491 1727203985.09061: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [0affcd87-79f5-0a4a-ad01-000000001031] 7491 1727203985.09074: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001031 ok: [managed-node3] => {} MSG: current_interfaces: ['eth0', 'lo', 'peerveth0', 'veth0'] 7491 1727203985.09224: no more pending results, returning what we have 7491 1727203985.09229: results queue empty 7491 1727203985.09230: checking for any_errors_fatal 7491 1727203985.09240: done checking for any_errors_fatal 7491 1727203985.09240: checking for max_fail_percentage 7491 1727203985.09242: done checking for max_fail_percentage 7491 1727203985.09243: checking to see if all hosts have failed and the running result is not ok 7491 1727203985.09245: done checking to see if all hosts have failed 7491 1727203985.09245: getting the remaining hosts for this loop 7491 1727203985.09247: done getting the remaining hosts for this loop 7491 1727203985.09251: getting the next task for host managed-node3 7491 1727203985.09260: done getting next task for host managed-node3 7491 1727203985.09267: ^ task is: TASK: Install iproute 7491 1727203985.09270: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203985.09275: getting variables 7491 1727203985.09277: in VariableManager get_vars() 7491 1727203985.09335: Calling all_inventory to load vars for managed-node3 7491 1727203985.09339: Calling groups_inventory to load vars for managed-node3 7491 1727203985.09342: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203985.09354: Calling all_plugins_play to load vars for managed-node3 7491 1727203985.09357: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203985.09360: Calling groups_plugins_play to load vars for managed-node3 7491 1727203985.10484: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001031 7491 1727203985.10488: WORKER PROCESS EXITING 7491 1727203985.11120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203985.12946: done with get_vars() 7491 1727203985.12974: done getting variables 7491 1727203985.13046: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:53:05 
-0400 (0:00:00.063) 0:00:27.054 ***** 7491 1727203985.13078: entering _queue_task() for managed-node3/package 7491 1727203985.13392: worker is 1 (out of 1 available) 7491 1727203985.13405: exiting _queue_task() for managed-node3/package 7491 1727203985.13421: done queuing things up, now waiting for results queue to drain 7491 1727203985.13422: waiting for pending results... 7491 1727203985.13714: running TaskExecutor() for managed-node3/TASK: Install iproute 7491 1727203985.13832: in run() - task 0affcd87-79f5-0a4a-ad01-000000000e02 7491 1727203985.13853: variable 'ansible_search_path' from source: unknown 7491 1727203985.13865: variable 'ansible_search_path' from source: unknown 7491 1727203985.13906: calling self._execute() 7491 1727203985.14004: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203985.14020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203985.14034: variable 'omit' from source: magic vars 7491 1727203985.14424: variable 'ansible_distribution_major_version' from source: facts 7491 1727203985.14440: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203985.14451: variable 'omit' from source: magic vars 7491 1727203985.14493: variable 'omit' from source: magic vars 7491 1727203985.14698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203985.17066: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203985.17143: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203985.17190: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203985.17239: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203985.17274: Loading FilterModule 
'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203985.17380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203985.17432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203985.17470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203985.17521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203985.17542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203985.17659: variable '__network_is_ostree' from source: set_fact 7491 1727203985.17678: variable 'omit' from source: magic vars 7491 1727203985.17715: variable 'omit' from source: magic vars 7491 1727203985.17753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203985.17788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203985.17810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203985.17832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203985.17844: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203985.17877: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203985.17888: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203985.17895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203985.17992: Set connection var ansible_timeout to 10 7491 1727203985.18006: Set connection var ansible_pipelining to False 7491 1727203985.18014: Set connection var ansible_shell_type to sh 7491 1727203985.18027: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203985.18037: Set connection var ansible_shell_executable to /bin/sh 7491 1727203985.18044: Set connection var ansible_connection to ssh 7491 1727203985.18072: variable 'ansible_shell_executable' from source: unknown 7491 1727203985.18078: variable 'ansible_connection' from source: unknown 7491 1727203985.18084: variable 'ansible_module_compression' from source: unknown 7491 1727203985.18089: variable 'ansible_shell_type' from source: unknown 7491 1727203985.18093: variable 'ansible_shell_executable' from source: unknown 7491 1727203985.18098: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203985.18108: variable 'ansible_pipelining' from source: unknown 7491 1727203985.18114: variable 'ansible_timeout' from source: unknown 7491 1727203985.18122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203985.18226: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203985.18242: variable 'omit' from source: magic vars 7491 1727203985.18251: starting attempt loop 7491 
1727203985.18257: running the handler 7491 1727203985.18267: variable 'ansible_facts' from source: unknown 7491 1727203985.18273: variable 'ansible_facts' from source: unknown 7491 1727203985.18309: _low_level_execute_command(): starting 7491 1727203985.18325: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203985.19073: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203985.19093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203985.19109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203985.19132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203985.19181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203985.19198: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203985.19214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203985.19235: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203985.19248: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203985.19261: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203985.19277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203985.19294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203985.19312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203985.19329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203985.19341: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203985.19355: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203985.19437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203985.19463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203985.19480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203985.19565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203985.21125: stdout chunk (state=3): >>>/root <<< 7491 1727203985.21283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203985.21287: stdout chunk (state=3): >>><<< 7491 1727203985.21296: stderr chunk (state=3): >>><<< 7491 1727203985.21320: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 
1727203985.21333: _low_level_execute_command(): starting 7491 1727203985.21337: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203985.21319-8838-227138159894110 `" && echo ansible-tmp-1727203985.21319-8838-227138159894110="` echo /root/.ansible/tmp/ansible-tmp-1727203985.21319-8838-227138159894110 `" ) && sleep 0' 7491 1727203985.22546: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203985.23014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203985.23024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203985.23037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203985.23079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203985.23084: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203985.23094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203985.23107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203985.23115: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203985.23120: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203985.23128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203985.23137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203985.23149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203985.23156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 
1727203985.23162: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203985.23176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203985.23247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203985.23261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203985.23272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203985.23442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203985.25244: stdout chunk (state=3): >>>ansible-tmp-1727203985.21319-8838-227138159894110=/root/.ansible/tmp/ansible-tmp-1727203985.21319-8838-227138159894110 <<< 7491 1727203985.25439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203985.25443: stdout chunk (state=3): >>><<< 7491 1727203985.25456: stderr chunk (state=3): >>><<< 7491 1727203985.25481: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203985.21319-8838-227138159894110=/root/.ansible/tmp/ansible-tmp-1727203985.21319-8838-227138159894110 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203985.25516: variable 'ansible_module_compression' from source: unknown 7491 1727203985.25589: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7491 1727203985.25637: variable 'ansible_facts' from source: unknown 7491 1727203985.25738: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203985.21319-8838-227138159894110/AnsiballZ_dnf.py 7491 1727203985.25907: Sending initial data 7491 1727203985.25911: Sent initial data (148 bytes) 7491 1727203985.27020: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203985.27033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203985.27045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203985.27062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203985.27111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203985.27118: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203985.27132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203985.27146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203985.27153: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203985.27168: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203985.27175: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 7491 1727203985.27185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203985.27198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203985.27210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203985.27218: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203985.27236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203985.27314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203985.27340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203985.27356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203985.27436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203985.29092: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203985.29130: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203985.29165: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpy1v4cgul 
/root/.ansible/tmp/ansible-tmp-1727203985.21319-8838-227138159894110/AnsiballZ_dnf.py <<< 7491 1727203985.29203: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203985.30870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203985.30957: stderr chunk (state=3): >>><<< 7491 1727203985.30960: stdout chunk (state=3): >>><<< 7491 1727203985.30986: done transferring module to remote 7491 1727203985.30995: _low_level_execute_command(): starting 7491 1727203985.31000: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203985.21319-8838-227138159894110/ /root/.ansible/tmp/ansible-tmp-1727203985.21319-8838-227138159894110/AnsiballZ_dnf.py && sleep 0' 7491 1727203985.31691: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203985.31700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203985.31710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203985.31726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203985.31766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203985.31774: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203985.31784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203985.31798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203985.31805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203985.31812: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203985.31822: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 
1727203985.31832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203985.31843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203985.31851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203985.31859: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203985.31867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203985.31943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203985.31960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203985.31974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203985.32040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203985.33773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203985.33776: stdout chunk (state=3): >>><<< 7491 1727203985.33783: stderr chunk (state=3): >>><<< 7491 1727203985.33800: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203985.33804: _low_level_execute_command(): starting 7491 1727203985.33810: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203985.21319-8838-227138159894110/AnsiballZ_dnf.py && sleep 0' 7491 1727203985.34487: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203985.34496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203985.34506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203985.34522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203985.34567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203985.34575: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203985.34584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203985.34597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203985.34605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203985.34612: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203985.34621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203985.34628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203985.34640: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203985.34649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203985.34658: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203985.34670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203985.34740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203985.34758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203985.34776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203985.34847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203986.26235: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7491 1727203986.30285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203986.30347: stderr chunk (state=3): >>><<< 7491 1727203986.30351: stdout chunk (state=3): >>><<< 7491 1727203986.30368: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203986.30405: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203985.21319-8838-227138159894110/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203986.30413: _low_level_execute_command(): starting 7491 1727203986.30420: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203985.21319-8838-227138159894110/ > /dev/null 2>&1 && sleep 0' 7491 1727203986.30894: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203986.30898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203986.30945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.30949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203986.30951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.31004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203986.31012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203986.31027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203986.31076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203986.32839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203986.32895: stderr chunk (state=3): >>><<< 7491 1727203986.32900: stdout chunk (state=3): >>><<< 7491 1727203986.32916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203986.32925: handler run complete 7491 1727203986.33045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203986.33173: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203986.33205: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203986.33233: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203986.33271: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203986.33328: variable '__install_status' from source: set_fact 7491 1727203986.33343: Evaluated conditional (__install_status is success): True 7491 1727203986.33355: attempt loop complete, returning result 7491 1727203986.33358: _execute() done 7491 1727203986.33360: dumping result to json 7491 1727203986.33368: done dumping result, returning 7491 1727203986.33374: done running TaskExecutor() for managed-node3/TASK: Install iproute [0affcd87-79f5-0a4a-ad01-000000000e02] 7491 1727203986.33379: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e02 7491 1727203986.33475: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e02 7491 1727203986.33478: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7491 1727203986.33557: no more pending results, returning what we have 7491 1727203986.33560: results queue empty 7491 1727203986.33561: checking for any_errors_fatal 7491 1727203986.33569: done checking for any_errors_fatal 7491 1727203986.33570: checking for max_fail_percentage 7491 1727203986.33572: done checking for 
max_fail_percentage 7491 1727203986.33573: checking to see if all hosts have failed and the running result is not ok 7491 1727203986.33574: done checking to see if all hosts have failed 7491 1727203986.33575: getting the remaining hosts for this loop 7491 1727203986.33577: done getting the remaining hosts for this loop 7491 1727203986.33580: getting the next task for host managed-node3 7491 1727203986.33586: done getting next task for host managed-node3 7491 1727203986.33588: ^ task is: TASK: Create veth interface {{ interface }} 7491 1727203986.33590: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203986.33595: getting variables 7491 1727203986.33597: in VariableManager get_vars() 7491 1727203986.33644: Calling all_inventory to load vars for managed-node3 7491 1727203986.33647: Calling groups_inventory to load vars for managed-node3 7491 1727203986.33649: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203986.33659: Calling all_plugins_play to load vars for managed-node3 7491 1727203986.33661: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203986.33671: Calling groups_plugins_play to load vars for managed-node3 7491 1727203986.34522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203986.35446: done with get_vars() 7491 1727203986.35466: done getting variables 7491 1727203986.35510: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203986.35604: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:53:06 -0400 (0:00:01.225) 0:00:28.280 ***** 7491 1727203986.35631: entering _queue_task() for managed-node3/command 7491 1727203986.35853: worker is 1 (out of 1 available) 7491 1727203986.35870: exiting _queue_task() for managed-node3/command 7491 1727203986.35885: done queuing things up, now waiting for results queue to drain 7491 1727203986.35886: waiting for pending results... 
7491 1727203986.36068: running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 7491 1727203986.36140: in run() - task 0affcd87-79f5-0a4a-ad01-000000000e03 7491 1727203986.36151: variable 'ansible_search_path' from source: unknown 7491 1727203986.36155: variable 'ansible_search_path' from source: unknown 7491 1727203986.36366: variable 'interface' from source: play vars 7491 1727203986.36428: variable 'interface' from source: play vars 7491 1727203986.36481: variable 'interface' from source: play vars 7491 1727203986.36601: Loaded config def from plugin (lookup/items) 7491 1727203986.36605: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7491 1727203986.36624: variable 'omit' from source: magic vars 7491 1727203986.36726: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203986.36732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203986.36741: variable 'omit' from source: magic vars 7491 1727203986.36920: variable 'ansible_distribution_major_version' from source: facts 7491 1727203986.36924: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203986.37056: variable 'type' from source: play vars 7491 1727203986.37060: variable 'state' from source: include params 7491 1727203986.37062: variable 'interface' from source: play vars 7491 1727203986.37067: variable 'current_interfaces' from source: set_fact 7491 1727203986.37075: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7491 1727203986.37078: when evaluation is False, skipping this task 7491 1727203986.37099: variable 'item' from source: unknown 7491 1727203986.37149: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 
'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 7491 1727203986.37307: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203986.37310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203986.37312: variable 'omit' from source: magic vars 7491 1727203986.37366: variable 'ansible_distribution_major_version' from source: facts 7491 1727203986.37370: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203986.37487: variable 'type' from source: play vars 7491 1727203986.37491: variable 'state' from source: include params 7491 1727203986.37493: variable 'interface' from source: play vars 7491 1727203986.37498: variable 'current_interfaces' from source: set_fact 7491 1727203986.37504: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7491 1727203986.37506: when evaluation is False, skipping this task 7491 1727203986.37528: variable 'item' from source: unknown 7491 1727203986.37573: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 7491 1727203986.37653: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203986.37657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203986.37659: variable 'omit' from source: magic vars 7491 1727203986.37754: variable 'ansible_distribution_major_version' from source: facts 7491 1727203986.37758: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203986.37872: variable 'type' from source: play vars 7491 
1727203986.37879: variable 'state' from source: include params 7491 1727203986.37882: variable 'interface' from source: play vars 7491 1727203986.37887: variable 'current_interfaces' from source: set_fact 7491 1727203986.37892: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7491 1727203986.37895: when evaluation is False, skipping this task 7491 1727203986.37914: variable 'item' from source: unknown 7491 1727203986.37958: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 7491 1727203986.38031: dumping result to json 7491 1727203986.38034: done dumping result, returning 7491 1727203986.38036: done running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 [0affcd87-79f5-0a4a-ad01-000000000e03] 7491 1727203986.38038: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e03 7491 1727203986.38073: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e03 7491 1727203986.38080: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false } MSG: All items skipped 7491 1727203986.38114: no more pending results, returning what we have 7491 1727203986.38118: results queue empty 7491 1727203986.38120: checking for any_errors_fatal 7491 1727203986.38128: done checking for any_errors_fatal 7491 1727203986.38128: checking for max_fail_percentage 7491 1727203986.38130: done checking for max_fail_percentage 7491 1727203986.38131: checking to see if all hosts have failed and the running result is not ok 7491 1727203986.38132: done checking to see if all hosts have failed 7491 1727203986.38132: getting the remaining hosts for this loop 7491 1727203986.38134: done getting the remaining hosts for this loop 
7491 1727203986.38138: getting the next task for host managed-node3 7491 1727203986.38144: done getting next task for host managed-node3 7491 1727203986.38145: ^ task is: TASK: Set up veth as managed by NetworkManager 7491 1727203986.38148: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203986.38152: getting variables 7491 1727203986.38154: in VariableManager get_vars() 7491 1727203986.38209: Calling all_inventory to load vars for managed-node3 7491 1727203986.38212: Calling groups_inventory to load vars for managed-node3 7491 1727203986.38214: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203986.38224: Calling all_plugins_play to load vars for managed-node3 7491 1727203986.38227: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203986.38229: Calling groups_plugins_play to load vars for managed-node3 7491 1727203986.39169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203986.40103: done with get_vars() 7491 1727203986.40121: done getting variables 7491 1727203986.40167: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] 
******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.045) 0:00:28.325 ***** 7491 1727203986.40192: entering _queue_task() for managed-node3/command 7491 1727203986.40405: worker is 1 (out of 1 available) 7491 1727203986.40421: exiting _queue_task() for managed-node3/command 7491 1727203986.40436: done queuing things up, now waiting for results queue to drain 7491 1727203986.40437: waiting for pending results... 7491 1727203986.40639: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 7491 1727203986.40784: in run() - task 0affcd87-79f5-0a4a-ad01-000000000e04 7491 1727203986.40823: variable 'ansible_search_path' from source: unknown 7491 1727203986.40831: variable 'ansible_search_path' from source: unknown 7491 1727203986.40883: calling self._execute() 7491 1727203986.40989: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203986.41001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203986.41018: variable 'omit' from source: magic vars 7491 1727203986.41399: variable 'ansible_distribution_major_version' from source: facts 7491 1727203986.41417: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203986.41668: variable 'type' from source: play vars 7491 1727203986.41704: variable 'state' from source: include params 7491 1727203986.41720: Evaluated conditional (type == 'veth' and state == 'present'): False 7491 1727203986.41727: when evaluation is False, skipping this task 7491 1727203986.41734: _execute() done 7491 1727203986.41739: dumping result to json 7491 1727203986.41747: done dumping result, returning 7491 1727203986.41765: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-0a4a-ad01-000000000e04] 7491 
1727203986.41778: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e04 7491 1727203986.41924: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e04 7491 1727203986.41926: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 7491 1727203986.41974: no more pending results, returning what we have 7491 1727203986.41978: results queue empty 7491 1727203986.41979: checking for any_errors_fatal 7491 1727203986.41991: done checking for any_errors_fatal 7491 1727203986.41992: checking for max_fail_percentage 7491 1727203986.42008: done checking for max_fail_percentage 7491 1727203986.42009: checking to see if all hosts have failed and the running result is not ok 7491 1727203986.42010: done checking to see if all hosts have failed 7491 1727203986.42011: getting the remaining hosts for this loop 7491 1727203986.42013: done getting the remaining hosts for this loop 7491 1727203986.42017: getting the next task for host managed-node3 7491 1727203986.42022: done getting next task for host managed-node3 7491 1727203986.42024: ^ task is: TASK: Delete veth interface {{ interface }} 7491 1727203986.42027: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203986.42030: getting variables 7491 1727203986.42032: in VariableManager get_vars() 7491 1727203986.42077: Calling all_inventory to load vars for managed-node3 7491 1727203986.42080: Calling groups_inventory to load vars for managed-node3 7491 1727203986.42081: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203986.42092: Calling all_plugins_play to load vars for managed-node3 7491 1727203986.42094: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203986.42097: Calling groups_plugins_play to load vars for managed-node3 7491 1727203986.42908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203986.44509: done with get_vars() 7491 1727203986.44541: done getting variables 7491 1727203986.44604: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203986.44727: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.045) 0:00:28.371 ***** 7491 1727203986.44758: entering _queue_task() for managed-node3/command 7491 1727203986.45066: worker is 1 (out of 1 available) 7491 1727203986.45078: exiting _queue_task() for managed-node3/command 7491 1727203986.45091: done queuing things up, now waiting for results queue to drain 7491 1727203986.45092: waiting for pending results... 
7491 1727203986.45381: running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 7491 1727203986.45498: in run() - task 0affcd87-79f5-0a4a-ad01-000000000e05 7491 1727203986.45521: variable 'ansible_search_path' from source: unknown 7491 1727203986.45534: variable 'ansible_search_path' from source: unknown 7491 1727203986.45576: calling self._execute() 7491 1727203986.45681: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203986.45692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203986.45705: variable 'omit' from source: magic vars 7491 1727203986.46085: variable 'ansible_distribution_major_version' from source: facts 7491 1727203986.46102: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203986.46324: variable 'type' from source: play vars 7491 1727203986.46334: variable 'state' from source: include params 7491 1727203986.46343: variable 'interface' from source: play vars 7491 1727203986.46350: variable 'current_interfaces' from source: set_fact 7491 1727203986.46362: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 7491 1727203986.46375: variable 'omit' from source: magic vars 7491 1727203986.46426: variable 'omit' from source: magic vars 7491 1727203986.46536: variable 'interface' from source: play vars 7491 1727203986.46557: variable 'omit' from source: magic vars 7491 1727203986.46604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203986.46649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203986.46678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203986.46699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 
1727203986.46713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203986.46753: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203986.46762: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203986.46771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203986.46883: Set connection var ansible_timeout to 10 7491 1727203986.46895: Set connection var ansible_pipelining to False 7491 1727203986.46905: Set connection var ansible_shell_type to sh 7491 1727203986.46914: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203986.46930: Set connection var ansible_shell_executable to /bin/sh 7491 1727203986.46947: Set connection var ansible_connection to ssh 7491 1727203986.46980: variable 'ansible_shell_executable' from source: unknown 7491 1727203986.46990: variable 'ansible_connection' from source: unknown 7491 1727203986.46999: variable 'ansible_module_compression' from source: unknown 7491 1727203986.47007: variable 'ansible_shell_type' from source: unknown 7491 1727203986.47016: variable 'ansible_shell_executable' from source: unknown 7491 1727203986.47029: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203986.47039: variable 'ansible_pipelining' from source: unknown 7491 1727203986.47050: variable 'ansible_timeout' from source: unknown 7491 1727203986.47058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203986.47207: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203986.47226: variable 'omit' from source: magic vars 7491 
1727203986.47235: starting attempt loop 7491 1727203986.47241: running the handler 7491 1727203986.47259: _low_level_execute_command(): starting 7491 1727203986.47277: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203986.48077: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203986.48092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203986.48105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203986.48129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203986.48180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203986.48192: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203986.48206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.48226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203986.48238: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203986.48248: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203986.48263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203986.48279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203986.48297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203986.48311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203986.48324: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203986.48337: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.48421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203986.48445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203986.48463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203986.48539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203986.50088: stdout chunk (state=3): >>>/root <<< 7491 1727203986.50292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203986.50296: stdout chunk (state=3): >>><<< 7491 1727203986.50298: stderr chunk (state=3): >>><<< 7491 1727203986.50429: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203986.50442: _low_level_execute_command(): starting 7491 
1727203986.50444: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203986.5032182-8893-116582446060108 `" && echo ansible-tmp-1727203986.5032182-8893-116582446060108="` echo /root/.ansible/tmp/ansible-tmp-1727203986.5032182-8893-116582446060108 `" ) && sleep 0' 7491 1727203986.51128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203986.51144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203986.51160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203986.51182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203986.51241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203986.51253: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203986.51271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.51291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203986.51311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203986.51329: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203986.51342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203986.51356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203986.51374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203986.51388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203986.51400: stderr chunk (state=3): >>>debug2: match 
found <<< 7491 1727203986.51425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.51504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203986.51538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203986.51561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203986.51661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203986.53456: stdout chunk (state=3): >>>ansible-tmp-1727203986.5032182-8893-116582446060108=/root/.ansible/tmp/ansible-tmp-1727203986.5032182-8893-116582446060108 <<< 7491 1727203986.53587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203986.53681: stderr chunk (state=3): >>><<< 7491 1727203986.53686: stdout chunk (state=3): >>><<< 7491 1727203986.53715: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203986.5032182-8893-116582446060108=/root/.ansible/tmp/ansible-tmp-1727203986.5032182-8893-116582446060108 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203986.53748: variable 'ansible_module_compression' from source: unknown 7491 1727203986.53810: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203986.53843: variable 'ansible_facts' from source: unknown 7491 1727203986.53932: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203986.5032182-8893-116582446060108/AnsiballZ_command.py 7491 1727203986.54083: Sending initial data 7491 1727203986.54086: Sent initial data (154 bytes) 7491 1727203986.55071: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203986.55084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203986.55089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203986.55102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203986.55143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203986.55160: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203986.55165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.55176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203986.55194: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203986.55198: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203986.55200: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 7491 1727203986.55206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203986.55226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203986.55229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203986.55235: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203986.55254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.55315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203986.55333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203986.55348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203986.55420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203986.57134: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203986.57140: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203986.57172: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpwf0uku80 
/root/.ansible/tmp/ansible-tmp-1727203986.5032182-8893-116582446060108/AnsiballZ_command.py <<< 7491 1727203986.57224: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203986.58306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203986.58436: stderr chunk (state=3): >>><<< 7491 1727203986.58439: stdout chunk (state=3): >>><<< 7491 1727203986.58459: done transferring module to remote 7491 1727203986.58470: _low_level_execute_command(): starting 7491 1727203986.58476: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203986.5032182-8893-116582446060108/ /root/.ansible/tmp/ansible-tmp-1727203986.5032182-8893-116582446060108/AnsiballZ_command.py && sleep 0' 7491 1727203986.58937: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203986.58943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203986.58993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.58997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203986.59000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.59052: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203986.59055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203986.59105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203986.60783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203986.60868: stderr chunk (state=3): >>><<< 7491 1727203986.60871: stdout chunk (state=3): >>><<< 7491 1727203986.60973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203986.60977: _low_level_execute_command(): starting 7491 1727203986.60980: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203986.5032182-8893-116582446060108/AnsiballZ_command.py && sleep 0' 
7491 1727203986.61590: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203986.61605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203986.61628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203986.61648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203986.61692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203986.61705: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203986.61722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.61748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203986.61760: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203986.61774: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203986.61787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203986.61802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203986.61820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203986.61833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203986.61850: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203986.61866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.61941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203986.61962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203986.61979: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203986.62089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203986.76919: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-24 14:53:06.749304", "end": "2024-09-24 14:53:06.768119", "delta": "0:00:00.018815", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203986.78091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203986.78151: stderr chunk (state=3): >>><<< 7491 1727203986.78155: stdout chunk (state=3): >>><<< 7491 1727203986.78174: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-24 14:53:06.749304", "end": "2024-09-24 14:53:06.768119", "delta": "0:00:00.018815", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203986.78207: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203986.5032182-8893-116582446060108/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203986.78214: _low_level_execute_command(): starting 7491 1727203986.78222: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203986.5032182-8893-116582446060108/ > /dev/null 2>&1 && sleep 0' 7491 1727203986.78707: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 
1727203986.78711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203986.78730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.78743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203986.78792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203986.78806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203986.78860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203986.80611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203986.80670: stderr chunk (state=3): >>><<< 7491 1727203986.80674: stdout chunk (state=3): >>><<< 7491 1727203986.80694: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203986.80699: handler run complete 7491 1727203986.80721: Evaluated conditional (False): False 7491 1727203986.80727: attempt loop complete, returning result 7491 1727203986.80730: _execute() done 7491 1727203986.80732: dumping result to json 7491 1727203986.80740: done dumping result, returning 7491 1727203986.80745: done running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 [0affcd87-79f5-0a4a-ad01-000000000e05] 7491 1727203986.80752: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e05 7491 1727203986.80846: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e05 7491 1727203986.80850: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.018815", "end": "2024-09-24 14:53:06.768119", "rc": 0, "start": "2024-09-24 14:53:06.749304" } 7491 1727203986.80913: no more pending results, returning what we have 7491 1727203986.80916: results queue empty 7491 1727203986.80917: checking for any_errors_fatal 7491 1727203986.80925: done checking for any_errors_fatal 7491 1727203986.80926: checking for max_fail_percentage 7491 1727203986.80927: done checking for max_fail_percentage 7491 1727203986.80928: checking to see if all hosts have failed and the running result 
is not ok 7491 1727203986.80929: done checking to see if all hosts have failed 7491 1727203986.80930: getting the remaining hosts for this loop 7491 1727203986.80932: done getting the remaining hosts for this loop 7491 1727203986.80935: getting the next task for host managed-node3 7491 1727203986.80942: done getting next task for host managed-node3 7491 1727203986.80944: ^ task is: TASK: Create dummy interface {{ interface }} 7491 1727203986.80946: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203986.80950: getting variables 7491 1727203986.80953: in VariableManager get_vars() 7491 1727203986.81007: Calling all_inventory to load vars for managed-node3 7491 1727203986.81010: Calling groups_inventory to load vars for managed-node3 7491 1727203986.81013: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203986.81023: Calling all_plugins_play to load vars for managed-node3 7491 1727203986.81026: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203986.81029: Calling groups_plugins_play to load vars for managed-node3 7491 1727203986.82006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203986.82926: done with get_vars() 7491 1727203986.82946: done getting variables 7491 1727203986.82993: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203986.83081: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.383) 0:00:28.754 ***** 7491 1727203986.83104: entering _queue_task() for managed-node3/command 7491 1727203986.83339: worker is 1 (out of 1 available) 7491 1727203986.83353: exiting _queue_task() for managed-node3/command 7491 1727203986.83368: done queuing things up, now waiting for results queue to drain 7491 1727203986.83370: waiting for pending results... 
7491 1727203986.83556: running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 7491 1727203986.83640: in run() - task 0affcd87-79f5-0a4a-ad01-000000000e06 7491 1727203986.83650: variable 'ansible_search_path' from source: unknown 7491 1727203986.83654: variable 'ansible_search_path' from source: unknown 7491 1727203986.83685: calling self._execute() 7491 1727203986.83762: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203986.83769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203986.83778: variable 'omit' from source: magic vars 7491 1727203986.84060: variable 'ansible_distribution_major_version' from source: facts 7491 1727203986.84071: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203986.84212: variable 'type' from source: play vars 7491 1727203986.84215: variable 'state' from source: include params 7491 1727203986.84223: variable 'interface' from source: play vars 7491 1727203986.84232: variable 'current_interfaces' from source: set_fact 7491 1727203986.84240: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7491 1727203986.84243: when evaluation is False, skipping this task 7491 1727203986.84246: _execute() done 7491 1727203986.84249: dumping result to json 7491 1727203986.84251: done dumping result, returning 7491 1727203986.84255: done running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 [0affcd87-79f5-0a4a-ad01-000000000e06] 7491 1727203986.84266: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e06 7491 1727203986.84351: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e06 7491 1727203986.84354: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7491 
1727203986.84409: no more pending results, returning what we have 7491 1727203986.84413: results queue empty 7491 1727203986.84414: checking for any_errors_fatal 7491 1727203986.84425: done checking for any_errors_fatal 7491 1727203986.84425: checking for max_fail_percentage 7491 1727203986.84427: done checking for max_fail_percentage 7491 1727203986.84428: checking to see if all hosts have failed and the running result is not ok 7491 1727203986.84429: done checking to see if all hosts have failed 7491 1727203986.84430: getting the remaining hosts for this loop 7491 1727203986.84432: done getting the remaining hosts for this loop 7491 1727203986.84436: getting the next task for host managed-node3 7491 1727203986.84442: done getting next task for host managed-node3 7491 1727203986.84445: ^ task is: TASK: Delete dummy interface {{ interface }} 7491 1727203986.84447: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203986.84452: getting variables 7491 1727203986.84453: in VariableManager get_vars() 7491 1727203986.84506: Calling all_inventory to load vars for managed-node3 7491 1727203986.84509: Calling groups_inventory to load vars for managed-node3 7491 1727203986.84510: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203986.84524: Calling all_plugins_play to load vars for managed-node3 7491 1727203986.84527: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203986.84529: Calling groups_plugins_play to load vars for managed-node3 7491 1727203986.85357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203986.86411: done with get_vars() 7491 1727203986.86433: done getting variables 7491 1727203986.86482: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203986.86572: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.034) 0:00:28.789 ***** 7491 1727203986.86595: entering _queue_task() for managed-node3/command 7491 1727203986.86841: worker is 1 (out of 1 available) 7491 1727203986.86854: exiting _queue_task() for managed-node3/command 7491 1727203986.86870: done queuing things up, now waiting for results queue to drain 7491 1727203986.86871: waiting for pending results... 
7491 1727203986.87054: running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 7491 1727203986.87125: in run() - task 0affcd87-79f5-0a4a-ad01-000000000e07 7491 1727203986.87138: variable 'ansible_search_path' from source: unknown 7491 1727203986.87141: variable 'ansible_search_path' from source: unknown 7491 1727203986.87173: calling self._execute() 7491 1727203986.87251: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203986.87254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203986.87267: variable 'omit' from source: magic vars 7491 1727203986.87540: variable 'ansible_distribution_major_version' from source: facts 7491 1727203986.87550: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203986.87687: variable 'type' from source: play vars 7491 1727203986.87692: variable 'state' from source: include params 7491 1727203986.87697: variable 'interface' from source: play vars 7491 1727203986.87700: variable 'current_interfaces' from source: set_fact 7491 1727203986.87708: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7491 1727203986.87713: when evaluation is False, skipping this task 7491 1727203986.87715: _execute() done 7491 1727203986.87722: dumping result to json 7491 1727203986.87725: done dumping result, returning 7491 1727203986.87727: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 [0affcd87-79f5-0a4a-ad01-000000000e07] 7491 1727203986.87730: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e07 7491 1727203986.87818: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e07 7491 1727203986.87821: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7491 
1727203986.87896: no more pending results, returning what we have 7491 1727203986.87900: results queue empty 7491 1727203986.87901: checking for any_errors_fatal 7491 1727203986.87907: done checking for any_errors_fatal 7491 1727203986.87908: checking for max_fail_percentage 7491 1727203986.87910: done checking for max_fail_percentage 7491 1727203986.87910: checking to see if all hosts have failed and the running result is not ok 7491 1727203986.87912: done checking to see if all hosts have failed 7491 1727203986.87912: getting the remaining hosts for this loop 7491 1727203986.87914: done getting the remaining hosts for this loop 7491 1727203986.87921: getting the next task for host managed-node3 7491 1727203986.87926: done getting next task for host managed-node3 7491 1727203986.87928: ^ task is: TASK: Create tap interface {{ interface }} 7491 1727203986.87930: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203986.87933: getting variables 7491 1727203986.87935: in VariableManager get_vars() 7491 1727203986.87985: Calling all_inventory to load vars for managed-node3 7491 1727203986.87988: Calling groups_inventory to load vars for managed-node3 7491 1727203986.87990: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203986.88001: Calling all_plugins_play to load vars for managed-node3 7491 1727203986.88004: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203986.88006: Calling groups_plugins_play to load vars for managed-node3 7491 1727203986.88805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203986.89729: done with get_vars() 7491 1727203986.89749: done getting variables 7491 1727203986.89796: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203986.89885: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.033) 0:00:28.823 ***** 7491 1727203986.89911: entering _queue_task() for managed-node3/command 7491 1727203986.90149: worker is 1 (out of 1 available) 7491 1727203986.90162: exiting _queue_task() for managed-node3/command 7491 1727203986.90178: done queuing things up, now waiting for results queue to drain 7491 1727203986.90179: waiting for pending results... 
7491 1727203986.90360: running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 7491 1727203986.90442: in run() - task 0affcd87-79f5-0a4a-ad01-000000000e08 7491 1727203986.90457: variable 'ansible_search_path' from source: unknown 7491 1727203986.90460: variable 'ansible_search_path' from source: unknown 7491 1727203986.90491: calling self._execute() 7491 1727203986.90572: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203986.90576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203986.90583: variable 'omit' from source: magic vars 7491 1727203986.90861: variable 'ansible_distribution_major_version' from source: facts 7491 1727203986.90873: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203986.91013: variable 'type' from source: play vars 7491 1727203986.91019: variable 'state' from source: include params 7491 1727203986.91023: variable 'interface' from source: play vars 7491 1727203986.91026: variable 'current_interfaces' from source: set_fact 7491 1727203986.91033: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7491 1727203986.91036: when evaluation is False, skipping this task 7491 1727203986.91039: _execute() done 7491 1727203986.91041: dumping result to json 7491 1727203986.91044: done dumping result, returning 7491 1727203986.91049: done running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 [0affcd87-79f5-0a4a-ad01-000000000e08] 7491 1727203986.91057: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e08 7491 1727203986.91145: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e08 7491 1727203986.91147: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7491 
1727203986.91209: no more pending results, returning what we have 7491 1727203986.91213: results queue empty 7491 1727203986.91214: checking for any_errors_fatal 7491 1727203986.91221: done checking for any_errors_fatal 7491 1727203986.91222: checking for max_fail_percentage 7491 1727203986.91223: done checking for max_fail_percentage 7491 1727203986.91224: checking to see if all hosts have failed and the running result is not ok 7491 1727203986.91226: done checking to see if all hosts have failed 7491 1727203986.91226: getting the remaining hosts for this loop 7491 1727203986.91228: done getting the remaining hosts for this loop 7491 1727203986.91233: getting the next task for host managed-node3 7491 1727203986.91239: done getting next task for host managed-node3 7491 1727203986.91242: ^ task is: TASK: Delete tap interface {{ interface }} 7491 1727203986.91245: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203986.91249: getting variables 7491 1727203986.91251: in VariableManager get_vars() 7491 1727203986.91305: Calling all_inventory to load vars for managed-node3 7491 1727203986.91308: Calling groups_inventory to load vars for managed-node3 7491 1727203986.91310: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203986.91322: Calling all_plugins_play to load vars for managed-node3 7491 1727203986.91324: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203986.91327: Calling groups_plugins_play to load vars for managed-node3 7491 1727203986.92294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203986.97370: done with get_vars() 7491 1727203986.97393: done getting variables 7491 1727203986.97436: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203986.97508: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:53:06 -0400 (0:00:00.076) 0:00:28.899 ***** 7491 1727203986.97530: entering _queue_task() for managed-node3/command 7491 1727203986.97765: worker is 1 (out of 1 available) 7491 1727203986.97779: exiting _queue_task() for managed-node3/command 7491 1727203986.97793: done queuing things up, now waiting for results queue to drain 7491 1727203986.97795: waiting for pending results... 
7491 1727203986.97989: running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 7491 1727203986.98077: in run() - task 0affcd87-79f5-0a4a-ad01-000000000e09 7491 1727203986.98088: variable 'ansible_search_path' from source: unknown 7491 1727203986.98094: variable 'ansible_search_path' from source: unknown 7491 1727203986.98126: calling self._execute() 7491 1727203986.98204: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203986.98208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203986.98218: variable 'omit' from source: magic vars 7491 1727203986.98515: variable 'ansible_distribution_major_version' from source: facts 7491 1727203986.98527: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203986.98670: variable 'type' from source: play vars 7491 1727203986.98675: variable 'state' from source: include params 7491 1727203986.98679: variable 'interface' from source: play vars 7491 1727203986.98682: variable 'current_interfaces' from source: set_fact 7491 1727203986.98690: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7491 1727203986.98693: when evaluation is False, skipping this task 7491 1727203986.98696: _execute() done 7491 1727203986.98698: dumping result to json 7491 1727203986.98702: done dumping result, returning 7491 1727203986.98707: done running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 [0affcd87-79f5-0a4a-ad01-000000000e09] 7491 1727203986.98721: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e09 7491 1727203986.98809: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000e09 7491 1727203986.98813: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7491 1727203986.98868: no 
more pending results, returning what we have 7491 1727203986.98872: results queue empty 7491 1727203986.98873: checking for any_errors_fatal 7491 1727203986.98881: done checking for any_errors_fatal 7491 1727203986.98881: checking for max_fail_percentage 7491 1727203986.98883: done checking for max_fail_percentage 7491 1727203986.98884: checking to see if all hosts have failed and the running result is not ok 7491 1727203986.98885: done checking to see if all hosts have failed 7491 1727203986.98886: getting the remaining hosts for this loop 7491 1727203986.98888: done getting the remaining hosts for this loop 7491 1727203986.98892: getting the next task for host managed-node3 7491 1727203986.98900: done getting next task for host managed-node3 7491 1727203986.98903: ^ task is: TASK: TEST: I can configure an interface with auto_gateway disabled 7491 1727203986.98905: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203986.98909: getting variables 7491 1727203986.98911: in VariableManager get_vars() 7491 1727203986.98968: Calling all_inventory to load vars for managed-node3 7491 1727203986.98970: Calling groups_inventory to load vars for managed-node3 7491 1727203986.98972: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203986.98983: Calling all_plugins_play to load vars for managed-node3 7491 1727203986.98986: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203986.98989: Calling groups_plugins_play to load vars for managed-node3 7491 1727203986.99794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203987.00728: done with get_vars() 7491 1727203987.00747: done getting variables 7491 1727203987.00795: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with auto_gateway disabled] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:83 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.032) 0:00:28.932 ***** 7491 1727203987.00820: entering _queue_task() for managed-node3/debug 7491 1727203987.01050: worker is 1 (out of 1 available) 7491 1727203987.01065: exiting _queue_task() for managed-node3/debug 7491 1727203987.01078: done queuing things up, now waiting for results queue to drain 7491 1727203987.01080: waiting for pending results... 
7491 1727203987.01261: running TaskExecutor() for managed-node3/TASK: TEST: I can configure an interface with auto_gateway disabled 7491 1727203987.01322: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000af 7491 1727203987.01341: variable 'ansible_search_path' from source: unknown 7491 1727203987.01374: calling self._execute() 7491 1727203987.01453: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203987.01458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203987.01470: variable 'omit' from source: magic vars 7491 1727203987.01759: variable 'ansible_distribution_major_version' from source: facts 7491 1727203987.01770: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203987.01777: variable 'omit' from source: magic vars 7491 1727203987.01793: variable 'omit' from source: magic vars 7491 1727203987.01820: variable 'omit' from source: magic vars 7491 1727203987.01853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203987.01885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203987.01902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203987.01918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203987.01926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203987.01950: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203987.01954: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203987.01957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203987.02030: Set connection var ansible_timeout to 10 7491 
1727203987.02036: Set connection var ansible_pipelining to False 7491 1727203987.02041: Set connection var ansible_shell_type to sh 7491 1727203987.02046: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203987.02053: Set connection var ansible_shell_executable to /bin/sh 7491 1727203987.02058: Set connection var ansible_connection to ssh 7491 1727203987.02082: variable 'ansible_shell_executable' from source: unknown 7491 1727203987.02085: variable 'ansible_connection' from source: unknown 7491 1727203987.02088: variable 'ansible_module_compression' from source: unknown 7491 1727203987.02090: variable 'ansible_shell_type' from source: unknown 7491 1727203987.02093: variable 'ansible_shell_executable' from source: unknown 7491 1727203987.02095: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203987.02097: variable 'ansible_pipelining' from source: unknown 7491 1727203987.02100: variable 'ansible_timeout' from source: unknown 7491 1727203987.02102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203987.02204: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203987.02213: variable 'omit' from source: magic vars 7491 1727203987.02219: starting attempt loop 7491 1727203987.02222: running the handler 7491 1727203987.02256: handler run complete 7491 1727203987.02270: attempt loop complete, returning result 7491 1727203987.02274: _execute() done 7491 1727203987.02277: dumping result to json 7491 1727203987.02279: done dumping result, returning 7491 1727203987.02286: done running TaskExecutor() for managed-node3/TASK: TEST: I can configure an interface with auto_gateway disabled [0affcd87-79f5-0a4a-ad01-0000000000af] 7491 
1727203987.02291: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000af 7491 1727203987.02380: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000af 7491 1727203987.02383: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: ################################################## 7491 1727203987.02447: no more pending results, returning what we have 7491 1727203987.02451: results queue empty 7491 1727203987.02452: checking for any_errors_fatal 7491 1727203987.02458: done checking for any_errors_fatal 7491 1727203987.02459: checking for max_fail_percentage 7491 1727203987.02461: done checking for max_fail_percentage 7491 1727203987.02462: checking to see if all hosts have failed and the running result is not ok 7491 1727203987.02463: done checking to see if all hosts have failed 7491 1727203987.02465: getting the remaining hosts for this loop 7491 1727203987.02468: done getting the remaining hosts for this loop 7491 1727203987.02472: getting the next task for host managed-node3 7491 1727203987.02477: done getting next task for host managed-node3 7491 1727203987.02481: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7491 1727203987.02483: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203987.02487: getting variables 7491 1727203987.02488: in VariableManager get_vars() 7491 1727203987.02541: Calling all_inventory to load vars for managed-node3 7491 1727203987.02544: Calling groups_inventory to load vars for managed-node3 7491 1727203987.02545: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203987.02555: Calling all_plugins_play to load vars for managed-node3 7491 1727203987.02558: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203987.02560: Calling groups_plugins_play to load vars for managed-node3 7491 1727203987.03507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203987.04419: done with get_vars() 7491 1727203987.04436: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:87 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.037) 0:00:28.969 ***** 7491 1727203987.04530: entering _queue_task() for managed-node3/include_tasks 7491 1727203987.04762: worker is 1 (out of 1 available) 7491 1727203987.04777: exiting _queue_task() for managed-node3/include_tasks 7491 1727203987.04789: done queuing things up, now waiting for results queue to drain 7491 1727203987.04791: waiting for pending results... 
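The task queued above, at tests_auto_gateway.yml:87, pulls manage_test_interface.yml into the play via include_tasks. A minimal sketch of that kind of include follows; the vars shown (state, type) are inferred from the conditionals evaluated later in this log, not taken from the actual playbook file:

```yaml
# Sketch only -- not the verbatim contents of tests_auto_gateway.yml.
# The vars are inferred from conditionals that appear later in this log
# ("state not in [...]", "type not in [...]") and may differ from the real file.
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present
    type: veth
```

After the include is processed, the log shows "extending task lists for all hosts with included blocks": the included tasks are spliced into the host's task list rather than executed immediately.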
7491 1727203987.04982: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 7491 1727203987.05053: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000b0 7491 1727203987.05065: variable 'ansible_search_path' from source: unknown 7491 1727203987.05095: calling self._execute() 7491 1727203987.05174: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203987.05179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203987.05188: variable 'omit' from source: magic vars 7491 1727203987.05471: variable 'ansible_distribution_major_version' from source: facts 7491 1727203987.05482: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203987.05489: _execute() done 7491 1727203987.05492: dumping result to json 7491 1727203987.05494: done dumping result, returning 7491 1727203987.05501: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-0a4a-ad01-0000000000b0] 7491 1727203987.05508: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000b0 7491 1727203987.05602: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000b0 7491 1727203987.05605: WORKER PROCESS EXITING 7491 1727203987.05637: no more pending results, returning what we have 7491 1727203987.05643: in VariableManager get_vars() 7491 1727203987.05701: Calling all_inventory to load vars for managed-node3 7491 1727203987.05705: Calling groups_inventory to load vars for managed-node3 7491 1727203987.05707: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203987.05727: Calling all_plugins_play to load vars for managed-node3 7491 1727203987.05730: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203987.05733: Calling groups_plugins_play to load vars for managed-node3 7491 1727203987.06534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7491 1727203987.07472: done with get_vars() 7491 1727203987.07486: variable 'ansible_search_path' from source: unknown 7491 1727203987.07498: we have included files to process 7491 1727203987.07499: generating all_blocks data 7491 1727203987.07501: done generating all_blocks data 7491 1727203987.07505: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727203987.07506: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727203987.07507: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727203987.07783: in VariableManager get_vars() 7491 1727203987.07802: done with get_vars() 7491 1727203987.08253: done processing included file 7491 1727203987.08255: iterating over new_blocks loaded from include file 7491 1727203987.08256: in VariableManager get_vars() 7491 1727203987.08274: done with get_vars() 7491 1727203987.08275: filtering new block on tags 7491 1727203987.08296: done filtering new block on tags 7491 1727203987.08298: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 7491 1727203987.08301: extending task lists for all hosts with included blocks 7491 1727203987.12598: done extending task lists 7491 1727203987.12600: done processing included files 7491 1727203987.12601: results queue empty 7491 1727203987.12602: checking for any_errors_fatal 7491 1727203987.12606: done checking for any_errors_fatal 7491 1727203987.12607: checking for max_fail_percentage 7491 1727203987.12608: done checking for max_fail_percentage 7491 1727203987.12609: checking to see if all hosts have failed and the running 
result is not ok 7491 1727203987.12610: done checking to see if all hosts have failed 7491 1727203987.12611: getting the remaining hosts for this loop 7491 1727203987.12612: done getting the remaining hosts for this loop 7491 1727203987.12614: getting the next task for host managed-node3 7491 1727203987.12618: done getting next task for host managed-node3 7491 1727203987.12621: ^ task is: TASK: Ensure state in ["present", "absent"] 7491 1727203987.12623: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203987.12626: getting variables 7491 1727203987.12627: in VariableManager get_vars() 7491 1727203987.12650: Calling all_inventory to load vars for managed-node3 7491 1727203987.12653: Calling groups_inventory to load vars for managed-node3 7491 1727203987.12655: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203987.12662: Calling all_plugins_play to load vars for managed-node3 7491 1727203987.12666: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203987.12669: Calling groups_plugins_play to load vars for managed-node3 7491 1727203987.13916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203987.15614: done with get_vars() 7491 1727203987.15647: done getting variables 7491 1727203987.15696: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.111) 0:00:29.081 ***** 7491 1727203987.15727: entering _queue_task() for managed-node3/fail 7491 1727203987.16068: worker is 1 (out of 1 available) 7491 1727203987.16083: exiting _queue_task() for managed-node3/fail 7491 1727203987.16097: done queuing things up, now waiting for results queue to drain 7491 1727203987.16099: waiting for pending results... 
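The "Ensure state" task queued above is an input-validation guard. As the skip result below shows, it is a fail action whose when clause is the negated membership test state not in ["present", "absent"]; with a valid state the condition evaluates False and the task is skipped rather than run. A minimal sketch of that pattern (only the conditional is taken from the log; the message text is illustrative):

```yaml
# Validation-guard pattern seen in this log: fail only when the
# input variable falls outside the allowed set. The when expression
# is copied from the log's false_condition; the msg is illustrative.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Invalid state: {{ state }}"
  when: state not in ["present", "absent"]
```

The same pattern repeats immediately afterwards for the interface type, with the condition type not in ["dummy", "tap", "veth"].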
7491 1727203987.16426: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 7491 1727203987.16561: in run() - task 0affcd87-79f5-0a4a-ad01-0000000010aa 7491 1727203987.16585: variable 'ansible_search_path' from source: unknown 7491 1727203987.16593: variable 'ansible_search_path' from source: unknown 7491 1727203987.16635: calling self._execute() 7491 1727203987.16750: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203987.16769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203987.16785: variable 'omit' from source: magic vars 7491 1727203987.17203: variable 'ansible_distribution_major_version' from source: facts 7491 1727203987.17224: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203987.17374: variable 'state' from source: include params 7491 1727203987.17387: Evaluated conditional (state not in ["present", "absent"]): False 7491 1727203987.17394: when evaluation is False, skipping this task 7491 1727203987.17401: _execute() done 7491 1727203987.17407: dumping result to json 7491 1727203987.17421: done dumping result, returning 7491 1727203987.17431: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-0a4a-ad01-0000000010aa] 7491 1727203987.17443: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010aa skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7491 1727203987.17607: no more pending results, returning what we have 7491 1727203987.17612: results queue empty 7491 1727203987.17614: checking for any_errors_fatal 7491 1727203987.17616: done checking for any_errors_fatal 7491 1727203987.17616: checking for max_fail_percentage 7491 1727203987.17618: done checking for max_fail_percentage 7491 1727203987.17619: checking to see if all hosts have failed and the 
running result is not ok 7491 1727203987.17621: done checking to see if all hosts have failed 7491 1727203987.17621: getting the remaining hosts for this loop 7491 1727203987.17624: done getting the remaining hosts for this loop 7491 1727203987.17628: getting the next task for host managed-node3 7491 1727203987.17636: done getting next task for host managed-node3 7491 1727203987.17639: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7491 1727203987.17643: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203987.17647: getting variables 7491 1727203987.17650: in VariableManager get_vars() 7491 1727203987.17713: Calling all_inventory to load vars for managed-node3 7491 1727203987.17717: Calling groups_inventory to load vars for managed-node3 7491 1727203987.17719: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203987.17736: Calling all_plugins_play to load vars for managed-node3 7491 1727203987.17739: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203987.17742: Calling groups_plugins_play to load vars for managed-node3 7491 1727203987.18985: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010aa 7491 1727203987.18989: WORKER PROCESS EXITING 7491 1727203987.19651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203987.21314: done with get_vars() 7491 1727203987.21348: done getting variables 7491 1727203987.21413: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.057) 0:00:29.138 ***** 7491 1727203987.21446: entering _queue_task() for managed-node3/fail 7491 1727203987.21793: worker is 1 (out of 1 available) 7491 1727203987.21807: exiting _queue_task() for managed-node3/fail 7491 1727203987.21821: done queuing things up, now waiting for results queue to drain 7491 1727203987.21823: waiting for pending results... 
7491 1727203987.22124: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7491 1727203987.22238: in run() - task 0affcd87-79f5-0a4a-ad01-0000000010ab 7491 1727203987.22259: variable 'ansible_search_path' from source: unknown 7491 1727203987.22276: variable 'ansible_search_path' from source: unknown 7491 1727203987.22318: calling self._execute() 7491 1727203987.22430: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203987.22443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203987.22460: variable 'omit' from source: magic vars 7491 1727203987.22877: variable 'ansible_distribution_major_version' from source: facts 7491 1727203987.22888: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203987.22995: variable 'type' from source: play vars 7491 1727203987.23000: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7491 1727203987.23003: when evaluation is False, skipping this task 7491 1727203987.23007: _execute() done 7491 1727203987.23009: dumping result to json 7491 1727203987.23011: done dumping result, returning 7491 1727203987.23019: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-0a4a-ad01-0000000010ab] 7491 1727203987.23027: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010ab 7491 1727203987.23117: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010ab 7491 1727203987.23120: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7491 1727203987.23176: no more pending results, returning what we have 7491 1727203987.23181: results queue empty 7491 1727203987.23182: checking for any_errors_fatal 7491 1727203987.23190: done checking for any_errors_fatal 7491 1727203987.23191: checking for 
max_fail_percentage 7491 1727203987.23193: done checking for max_fail_percentage 7491 1727203987.23194: checking to see if all hosts have failed and the running result is not ok 7491 1727203987.23196: done checking to see if all hosts have failed 7491 1727203987.23196: getting the remaining hosts for this loop 7491 1727203987.23199: done getting the remaining hosts for this loop 7491 1727203987.23202: getting the next task for host managed-node3 7491 1727203987.23209: done getting next task for host managed-node3 7491 1727203987.23211: ^ task is: TASK: Include the task 'show_interfaces.yml' 7491 1727203987.23214: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203987.23217: getting variables 7491 1727203987.23219: in VariableManager get_vars() 7491 1727203987.23277: Calling all_inventory to load vars for managed-node3 7491 1727203987.23280: Calling groups_inventory to load vars for managed-node3 7491 1727203987.23282: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203987.23293: Calling all_plugins_play to load vars for managed-node3 7491 1727203987.23296: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203987.23298: Calling groups_plugins_play to load vars for managed-node3 7491 1727203987.24121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203987.25572: done with get_vars() 7491 1727203987.25609: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.042) 0:00:29.181 ***** 7491 1727203987.25728: entering _queue_task() for managed-node3/include_tasks 7491 1727203987.26063: worker is 1 (out of 1 available) 7491 1727203987.26078: exiting _queue_task() for managed-node3/include_tasks 7491 1727203987.26092: done queuing things up, now waiting for results queue to drain 7491 1727203987.26093: waiting for pending results... 
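The include queued above starts a nested chain: show_interfaces.yml in turn includes get_current_interfaces.yml, as the log shows shortly below. Each level of nesting adds another "tasks child state? (HOST STATE: ...)" layer to the HOST STATE lines. A sketch of the nesting, assuming the inner task body (the file paths are from the log; the task content is illustrative):

```yaml
# Sketch of the nested include chain reflected in the deepening
# "tasks child state" entries in this log.
# File: tasks/show_interfaces.yml (path taken from the log)
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```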
7491 1727203987.26274: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 7491 1727203987.26347: in run() - task 0affcd87-79f5-0a4a-ad01-0000000010ac 7491 1727203987.26358: variable 'ansible_search_path' from source: unknown 7491 1727203987.26367: variable 'ansible_search_path' from source: unknown 7491 1727203987.26398: calling self._execute() 7491 1727203987.26484: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203987.26487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203987.26498: variable 'omit' from source: magic vars 7491 1727203987.26796: variable 'ansible_distribution_major_version' from source: facts 7491 1727203987.26806: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203987.26812: _execute() done 7491 1727203987.26815: dumping result to json 7491 1727203987.26821: done dumping result, returning 7491 1727203987.26827: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-0a4a-ad01-0000000010ac] 7491 1727203987.26833: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010ac 7491 1727203987.26921: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010ac 7491 1727203987.26924: WORKER PROCESS EXITING 7491 1727203987.26956: no more pending results, returning what we have 7491 1727203987.26961: in VariableManager get_vars() 7491 1727203987.27020: Calling all_inventory to load vars for managed-node3 7491 1727203987.27024: Calling groups_inventory to load vars for managed-node3 7491 1727203987.27026: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203987.27046: Calling all_plugins_play to load vars for managed-node3 7491 1727203987.27049: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203987.27053: Calling groups_plugins_play to load vars for managed-node3 7491 1727203987.29209: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203987.31578: done with get_vars() 7491 1727203987.31615: variable 'ansible_search_path' from source: unknown 7491 1727203987.31618: variable 'ansible_search_path' from source: unknown 7491 1727203987.31650: we have included files to process 7491 1727203987.31651: generating all_blocks data 7491 1727203987.31652: done generating all_blocks data 7491 1727203987.31656: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203987.31657: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203987.31658: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727203987.31741: in VariableManager get_vars() 7491 1727203987.31766: done with get_vars() 7491 1727203987.31850: done processing included file 7491 1727203987.31852: iterating over new_blocks loaded from include file 7491 1727203987.31854: in VariableManager get_vars() 7491 1727203987.31872: done with get_vars() 7491 1727203987.31873: filtering new block on tags 7491 1727203987.31885: done filtering new block on tags 7491 1727203987.31886: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 7491 1727203987.31890: extending task lists for all hosts with included blocks 7491 1727203987.32129: done extending task lists 7491 1727203987.32130: done processing included files 7491 1727203987.32131: results queue empty 7491 1727203987.32132: checking for any_errors_fatal 7491 1727203987.32135: done checking for any_errors_fatal 7491 1727203987.32135: checking for max_fail_percentage 7491 
1727203987.32136: done checking for max_fail_percentage 7491 1727203987.32137: checking to see if all hosts have failed and the running result is not ok 7491 1727203987.32137: done checking to see if all hosts have failed 7491 1727203987.32138: getting the remaining hosts for this loop 7491 1727203987.32139: done getting the remaining hosts for this loop 7491 1727203987.32140: getting the next task for host managed-node3 7491 1727203987.32143: done getting next task for host managed-node3 7491 1727203987.32144: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7491 1727203987.32146: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203987.32148: getting variables 7491 1727203987.32149: in VariableManager get_vars() 7491 1727203987.32161: Calling all_inventory to load vars for managed-node3 7491 1727203987.32162: Calling groups_inventory to load vars for managed-node3 7491 1727203987.32165: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203987.32170: Calling all_plugins_play to load vars for managed-node3 7491 1727203987.32171: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203987.32173: Calling groups_plugins_play to load vars for managed-node3 7491 1727203987.32954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203987.34378: done with get_vars() 7491 1727203987.34408: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.087) 0:00:29.268 ***** 7491 1727203987.34489: entering _queue_task() for managed-node3/include_tasks 7491 1727203987.34904: worker is 1 (out of 1 available) 7491 1727203987.34919: exiting _queue_task() for managed-node3/include_tasks 7491 1727203987.34934: done queuing things up, now waiting for results queue to drain 7491 1727203987.34935: waiting for pending results... 
7491 1727203987.35261: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 7491 1727203987.35424: in run() - task 0affcd87-79f5-0a4a-ad01-00000000130a 7491 1727203987.35454: variable 'ansible_search_path' from source: unknown 7491 1727203987.35465: variable 'ansible_search_path' from source: unknown 7491 1727203987.35580: calling self._execute() 7491 1727203987.35748: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203987.35753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203987.35756: variable 'omit' from source: magic vars 7491 1727203987.36060: variable 'ansible_distribution_major_version' from source: facts 7491 1727203987.36074: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203987.36080: _execute() done 7491 1727203987.36083: dumping result to json 7491 1727203987.36086: done dumping result, returning 7491 1727203987.36092: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-0a4a-ad01-00000000130a] 7491 1727203987.36099: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000130a 7491 1727203987.36187: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000130a 7491 1727203987.36190: WORKER PROCESS EXITING 7491 1727203987.36221: no more pending results, returning what we have 7491 1727203987.36227: in VariableManager get_vars() 7491 1727203987.36297: Calling all_inventory to load vars for managed-node3 7491 1727203987.36302: Calling groups_inventory to load vars for managed-node3 7491 1727203987.36305: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203987.36318: Calling all_plugins_play to load vars for managed-node3 7491 1727203987.36321: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203987.36324: Calling groups_plugins_play to load vars for managed-node3 7491 1727203987.37154: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203987.38756: done with get_vars() 7491 1727203987.38781: variable 'ansible_search_path' from source: unknown 7491 1727203987.38782: variable 'ansible_search_path' from source: unknown 7491 1727203987.38827: we have included files to process 7491 1727203987.38828: generating all_blocks data 7491 1727203987.38829: done generating all_blocks data 7491 1727203987.38830: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203987.38831: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203987.38837: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727203987.39043: done processing included file 7491 1727203987.39045: iterating over new_blocks loaded from include file 7491 1727203987.39046: in VariableManager get_vars() 7491 1727203987.39069: done with get_vars() 7491 1727203987.39070: filtering new block on tags 7491 1727203987.39083: done filtering new block on tags 7491 1727203987.39084: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 7491 1727203987.39088: extending task lists for all hosts with included blocks 7491 1727203987.39185: done extending task lists 7491 1727203987.39186: done processing included files 7491 1727203987.39186: results queue empty 7491 1727203987.39187: checking for any_errors_fatal 7491 1727203987.39189: done checking for any_errors_fatal 7491 1727203987.39190: checking for max_fail_percentage 7491 1727203987.39190: done checking for max_fail_percentage 7491 
1727203987.39191: checking to see if all hosts have failed and the running result is not ok 7491 1727203987.39192: done checking to see if all hosts have failed 7491 1727203987.39192: getting the remaining hosts for this loop 7491 1727203987.39193: done getting the remaining hosts for this loop 7491 1727203987.39195: getting the next task for host managed-node3 7491 1727203987.39198: done getting next task for host managed-node3 7491 1727203987.39200: ^ task is: TASK: Gather current interface info 7491 1727203987.39202: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203987.39204: getting variables 7491 1727203987.39204: in VariableManager get_vars() 7491 1727203987.39219: Calling all_inventory to load vars for managed-node3 7491 1727203987.39221: Calling groups_inventory to load vars for managed-node3 7491 1727203987.39222: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203987.39226: Calling all_plugins_play to load vars for managed-node3 7491 1727203987.39228: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203987.39230: Calling groups_plugins_play to load vars for managed-node3 7491 1727203987.39999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203987.40921: done with get_vars() 7491 1727203987.40940: done getting variables 7491 1727203987.40976: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.065) 0:00:29.333 ***** 7491 1727203987.41000: entering _queue_task() for managed-node3/command 7491 1727203987.41250: worker is 1 (out of 1 available) 7491 1727203987.41265: exiting _queue_task() for managed-node3/command 7491 1727203987.41279: done queuing things up, now waiting for results queue to drain 7491 1727203987.41280: waiting for pending results... 
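The "Gather current interface info" task queued above is a command task (the log loads the 'command' ActionModule for it) at get_current_interfaces.yml:3. The actual command is not visible in this log excerpt; listing /sys/class/net is an assumed, illustrative way to enumerate interfaces, as is the register variable name:

```yaml
# Sketch of a command task of the kind queued here. The real command in
# get_current_interfaces.yml is not shown in this log excerpt; the
# command line and register name below are assumptions for illustration.
- name: Gather current interface info
  command: ls -1 /sys/class/net
  register: _current_interfaces
  changed_when: false
```

Note the connection-plugin loading that follows ('ssh' Connection, 'sh' ShellModule): unlike the earlier fail and debug actions, a command task must actually reach the managed host, so the executor resolves the connection and shell plugins before running the module.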
7491 1727203987.41468: running TaskExecutor() for managed-node3/TASK: Gather current interface info 7491 1727203987.41553: in run() - task 0affcd87-79f5-0a4a-ad01-000000001341 7491 1727203987.41570: variable 'ansible_search_path' from source: unknown 7491 1727203987.41574: variable 'ansible_search_path' from source: unknown 7491 1727203987.41603: calling self._execute() 7491 1727203987.41689: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203987.41693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203987.41702: variable 'omit' from source: magic vars 7491 1727203987.41984: variable 'ansible_distribution_major_version' from source: facts 7491 1727203987.41997: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203987.42002: variable 'omit' from source: magic vars 7491 1727203987.42043: variable 'omit' from source: magic vars 7491 1727203987.42070: variable 'omit' from source: magic vars 7491 1727203987.42106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203987.42136: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203987.42155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203987.42170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203987.42181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203987.42208: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203987.42212: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203987.42214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203987.42285: Set connection 
var ansible_timeout to 10 7491 1727203987.42291: Set connection var ansible_pipelining to False 7491 1727203987.42296: Set connection var ansible_shell_type to sh 7491 1727203987.42305: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203987.42308: Set connection var ansible_shell_executable to /bin/sh 7491 1727203987.42316: Set connection var ansible_connection to ssh 7491 1727203987.42334: variable 'ansible_shell_executable' from source: unknown 7491 1727203987.42337: variable 'ansible_connection' from source: unknown 7491 1727203987.42340: variable 'ansible_module_compression' from source: unknown 7491 1727203987.42343: variable 'ansible_shell_type' from source: unknown 7491 1727203987.42345: variable 'ansible_shell_executable' from source: unknown 7491 1727203987.42348: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203987.42350: variable 'ansible_pipelining' from source: unknown 7491 1727203987.42352: variable 'ansible_timeout' from source: unknown 7491 1727203987.42357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203987.42463: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203987.42474: variable 'omit' from source: magic vars 7491 1727203987.42479: starting attempt loop 7491 1727203987.42481: running the handler 7491 1727203987.42495: _low_level_execute_command(): starting 7491 1727203987.42502: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203987.43041: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203987.43052: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203987.43084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203987.43098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203987.43109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203987.43159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203987.43167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203987.43182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203987.43240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203987.44854: stdout chunk (state=3): >>>/root <<< 7491 1727203987.44951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203987.45012: stderr chunk (state=3): >>><<< 7491 1727203987.45016: stdout chunk (state=3): >>><<< 7491 1727203987.45039: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203987.45052: _low_level_execute_command(): starting 7491 1727203987.45058: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203987.4503994-8933-208314426405738 `" && echo ansible-tmp-1727203987.4503994-8933-208314426405738="` echo /root/.ansible/tmp/ansible-tmp-1727203987.4503994-8933-208314426405738 `" ) && sleep 0' 7491 1727203987.45538: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203987.45545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203987.45579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203987.45583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7491 1727203987.45604: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203987.45655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203987.45671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203987.45725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203987.47543: stdout chunk (state=3): >>>ansible-tmp-1727203987.4503994-8933-208314426405738=/root/.ansible/tmp/ansible-tmp-1727203987.4503994-8933-208314426405738 <<< 7491 1727203987.47658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203987.47721: stderr chunk (state=3): >>><<< 7491 1727203987.47724: stdout chunk (state=3): >>><<< 7491 1727203987.47738: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203987.4503994-8933-208314426405738=/root/.ansible/tmp/ansible-tmp-1727203987.4503994-8933-208314426405738 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203987.47767: variable 'ansible_module_compression' from source: unknown 7491 1727203987.47818: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203987.47849: variable 'ansible_facts' from source: unknown 7491 1727203987.47904: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203987.4503994-8933-208314426405738/AnsiballZ_command.py 7491 1727203987.48014: Sending initial data 7491 1727203987.48020: Sent initial data (154 bytes) 7491 1727203987.48721: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203987.48725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203987.48758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203987.48769: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203987.48782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727203987.48787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 7491 1727203987.48797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203987.48802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203987.48855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203987.48872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203987.48915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203987.50602: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203987.50640: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203987.50681: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpuyrx23tm /root/.ansible/tmp/ansible-tmp-1727203987.4503994-8933-208314426405738/AnsiballZ_command.py <<< 7491 1727203987.50720: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203987.51512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203987.51632: 
stderr chunk (state=3): >>><<< 7491 1727203987.51636: stdout chunk (state=3): >>><<< 7491 1727203987.51654: done transferring module to remote 7491 1727203987.51665: _low_level_execute_command(): starting 7491 1727203987.51671: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203987.4503994-8933-208314426405738/ /root/.ansible/tmp/ansible-tmp-1727203987.4503994-8933-208314426405738/AnsiballZ_command.py && sleep 0' 7491 1727203987.52135: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203987.52140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203987.52175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203987.52187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203987.52198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203987.52249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203987.52252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203987.52269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203987.52321: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7491 1727203987.54090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203987.54101: stdout chunk (state=3): >>><<< 7491 1727203987.54113: stderr chunk (state=3): >>><<< 7491 1727203987.54136: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203987.54151: _low_level_execute_command(): starting 7491 1727203987.54161: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203987.4503994-8933-208314426405738/AnsiballZ_command.py && sleep 0' 7491 1727203987.54937: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203987.54958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203987.55007: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 7491 1727203987.55010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203987.55048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203987.55052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203987.55054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203987.55127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203987.55133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203987.55198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203987.68438: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:53:07.680452", "end": "2024-09-24 14:53:07.683628", "delta": "0:00:00.003176", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203987.69715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection 
to 10.31.15.87 closed. <<< 7491 1727203987.69723: stdout chunk (state=3): >>><<< 7491 1727203987.69726: stderr chunk (state=3): >>><<< 7491 1727203987.69883: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:53:07.680452", "end": "2024-09-24 14:53:07.683628", "delta": "0:00:00.003176", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
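
The module result above shows the logged `module_args` (`chdir: /sys/class/net`, `_raw_params: ls -1`, `_uses_shell: false`) for the task at `get_current_interfaces.yml:3`. A likely reconstruction of that task, inferred from those arguments rather than from the verbatim playbook source, would be:

```yaml
# Sketch of the "Gather current interface info" task, reconstructed
# from the logged module_args; the register name _current_interfaces
# is inferred from the later "variable '_current_interfaces' from
# source: set_fact" log line and may differ from the actual source.
- name: Gather current interface info
  ansible.builtin.command: ls -1
  args:
    chdir: /sys/class/net          # each entry here is a network interface
  register: _current_interfaces
```

Listing `/sys/class/net` is a common way to enumerate kernel-visible network interfaces without parsing `ip link` output, which matches the `stdout: "eth0\nlo"` seen in the result.
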
7491 1727203987.69887: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203987.4503994-8933-208314426405738/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203987.69890: _low_level_execute_command(): starting 7491 1727203987.69893: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203987.4503994-8933-208314426405738/ > /dev/null 2>&1 && sleep 0' 7491 1727203987.71261: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203987.71986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203987.72003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203987.72024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203987.72072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203987.72085: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203987.72097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203987.72114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203987.72128: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203987.72138: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203987.72149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203987.72169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203987.72184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203987.72196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203987.72206: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203987.72221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203987.72305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203987.72324: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203987.72338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203987.72412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203987.74273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203987.74277: stdout chunk (state=3): >>><<< 7491 1727203987.74280: stderr chunk (state=3): >>><<< 7491 1727203987.74370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203987.74373: handler run complete 7491 1727203987.74376: Evaluated conditional (False): False 7491 1727203987.74378: attempt loop complete, returning result 7491 1727203987.74380: _execute() done 7491 1727203987.74382: dumping result to json 7491 1727203987.74384: done dumping result, returning 7491 1727203987.74386: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [0affcd87-79f5-0a4a-ad01-000000001341] 7491 1727203987.74471: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001341 7491 1727203987.74553: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001341 7491 1727203987.74557: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003176", "end": "2024-09-24 14:53:07.683628", "rc": 0, "start": "2024-09-24 14:53:07.680452" } STDOUT: eth0 lo 7491 1727203987.74634: no more pending results, returning what we have 7491 1727203987.74638: results queue empty 7491 1727203987.74640: checking for any_errors_fatal 7491 1727203987.74641: done checking for any_errors_fatal 7491 1727203987.74642: checking for max_fail_percentage 7491 1727203987.74643: done checking for max_fail_percentage 7491 1727203987.74644: checking to see if all hosts have failed and the running result is not ok 7491 
1727203987.74645: done checking to see if all hosts have failed 7491 1727203987.74646: getting the remaining hosts for this loop 7491 1727203987.74648: done getting the remaining hosts for this loop 7491 1727203987.74652: getting the next task for host managed-node3 7491 1727203987.74659: done getting next task for host managed-node3 7491 1727203987.74661: ^ task is: TASK: Set current_interfaces 7491 1727203987.74669: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203987.74673: getting variables 7491 1727203987.74675: in VariableManager get_vars() 7491 1727203987.74728: Calling all_inventory to load vars for managed-node3 7491 1727203987.74731: Calling groups_inventory to load vars for managed-node3 7491 1727203987.74733: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203987.74745: Calling all_plugins_play to load vars for managed-node3 7491 1727203987.74747: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203987.74750: Calling groups_plugins_play to load vars for managed-node3 7491 1727203987.77501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203987.81184: done with get_vars() 7491 1727203987.81337: done getting variables 7491 1727203987.81407: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.404) 0:00:29.738 ***** 7491 1727203987.81561: entering _queue_task() for managed-node3/set_fact 7491 1727203987.82237: worker is 1 (out of 1 available) 7491 1727203987.82250: exiting _queue_task() for managed-node3/set_fact 7491 1727203987.82314: done queuing things up, now waiting for results queue to drain 7491 1727203987.82316: waiting for pending results... 
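
The next task logged, `Set current_interfaces` at `get_current_interfaces.yml:9`, runs entirely on the controller (`set_fact` needs no remote execution, which is why no `_low_level_execute_command()` calls follow). Given the resulting fact `current_interfaces: ["eth0", "lo"]` and the registered `_current_interfaces` variable, a plausible reconstruction is:

```yaml
# Sketch of the "Set current_interfaces" task; the use of
# stdout_lines is an assumption consistent with the logged result
# (the command's stdout "eth0\nlo" split into a two-element list).
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```

`stdout_lines` is the list form of a command result's `stdout`, split on newlines, which yields exactly the `["eth0", "lo"]` value shown in the task's `ansible_facts` output below.
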
7491 1727203987.83055: running TaskExecutor() for managed-node3/TASK: Set current_interfaces
7491 1727203987.83198: in run() - task 0affcd87-79f5-0a4a-ad01-000000001342
7491 1727203987.83890: variable 'ansible_search_path' from source: unknown
7491 1727203987.83899: variable 'ansible_search_path' from source: unknown
7491 1727203987.83946: calling self._execute()
7491 1727203987.84063: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203987.84080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203987.84094: variable 'omit' from source: magic vars
7491 1727203987.84469: variable 'ansible_distribution_major_version' from source: facts
7491 1727203987.84613: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203987.84629: variable 'omit' from source: magic vars
7491 1727203987.84692: variable 'omit' from source: magic vars
7491 1727203987.85013: variable '_current_interfaces' from source: set_fact
7491 1727203987.85081: variable 'omit' from source: magic vars
7491 1727203987.85138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7491 1727203987.85178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7491 1727203987.85208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7491 1727203987.85238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203987.85257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203987.85294: variable 'inventory_hostname' from source: host vars for 'managed-node3'
7491 1727203987.85302: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203987.85309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203987.85416: Set connection var ansible_timeout to 10
7491 1727203987.85429: Set connection var ansible_pipelining to False
7491 1727203987.85443: Set connection var ansible_shell_type to sh
7491 1727203987.85452: Set connection var ansible_module_compression to ZIP_DEFLATED
7491 1727203987.85463: Set connection var ansible_shell_executable to /bin/sh
7491 1727203987.85474: Set connection var ansible_connection to ssh
7491 1727203987.85500: variable 'ansible_shell_executable' from source: unknown
7491 1727203987.85507: variable 'ansible_connection' from source: unknown
7491 1727203987.85516: variable 'ansible_module_compression' from source: unknown
7491 1727203987.85522: variable 'ansible_shell_type' from source: unknown
7491 1727203987.85528: variable 'ansible_shell_executable' from source: unknown
7491 1727203987.85533: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203987.85540: variable 'ansible_pipelining' from source: unknown
7491 1727203987.85550: variable 'ansible_timeout' from source: unknown
7491 1727203987.85557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203987.85703: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7491 1727203987.85720: variable 'omit' from source: magic vars
7491 1727203987.85729: starting attempt loop
7491 1727203987.85735: running the handler
7491 1727203987.85749: handler run complete
7491 1727203987.85766: attempt loop complete, returning result
7491 1727203987.85774: _execute() done
7491 1727203987.85779: dumping result to json
7491 1727203987.85787: done dumping result, returning
7491 1727203987.85797: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [0affcd87-79f5-0a4a-ad01-000000001342]
7491 1727203987.85806: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001342
ok: [managed-node3] => {
    "ansible_facts": {
        "current_interfaces": [
            "eth0",
            "lo"
        ]
    },
    "changed": false
}
7491 1727203987.85967: no more pending results, returning what we have
7491 1727203987.85971: results queue empty
7491 1727203987.85973: checking for any_errors_fatal
7491 1727203987.85985: done checking for any_errors_fatal
7491 1727203987.85986: checking for max_fail_percentage
7491 1727203987.85988: done checking for max_fail_percentage
7491 1727203987.85989: checking to see if all hosts have failed and the running result is not ok
7491 1727203987.85990: done checking to see if all hosts have failed
7491 1727203987.85991: getting the remaining hosts for this loop
7491 1727203987.85994: done getting the remaining hosts for this loop
7491 1727203987.85998: getting the next task for host managed-node3
7491 1727203987.86008: done getting next task for host managed-node3
7491 1727203987.86011: ^ task is: TASK: Show current_interfaces
7491 1727203987.86016: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203987.86020: getting variables
7491 1727203987.86022: in VariableManager get_vars()
7491 1727203987.86084: Calling all_inventory to load vars for managed-node3
7491 1727203987.86087: Calling groups_inventory to load vars for managed-node3
7491 1727203987.86090: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203987.86103: Calling all_plugins_play to load vars for managed-node3
7491 1727203987.86106: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203987.86110: Calling groups_plugins_play to load vars for managed-node3
7491 1727203987.87445: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001342
7491 1727203987.87449: WORKER PROCESS EXITING
7491 1727203987.88384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203987.90272: done with get_vars()
7491 1727203987.90306: done getting variables
7491 1727203987.90371: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.088) 0:00:29.827 *****
7491 1727203987.90407: entering _queue_task() for managed-node3/debug
7491 1727203987.90735: worker is 1 (out of 1 available)
7491 1727203987.90784: exiting _queue_task() for managed-node3/debug
7491 1727203987.90801: done queuing things up, now waiting for results queue to drain
7491 1727203987.90802: waiting for pending results...
7491 1727203987.91522: running TaskExecutor() for managed-node3/TASK: Show current_interfaces
7491 1727203987.91646: in run() - task 0affcd87-79f5-0a4a-ad01-00000000130b
7491 1727203987.91668: variable 'ansible_search_path' from source: unknown
7491 1727203987.91675: variable 'ansible_search_path' from source: unknown
7491 1727203987.91719: calling self._execute()
7491 1727203987.91827: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203987.91839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203987.91854: variable 'omit' from source: magic vars
7491 1727203987.92280: variable 'ansible_distribution_major_version' from source: facts
7491 1727203987.92297: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203987.92307: variable 'omit' from source: magic vars
7491 1727203987.92360: variable 'omit' from source: magic vars
7491 1727203987.92465: variable 'current_interfaces' from source: set_fact
7491 1727203987.92504: variable 'omit' from source: magic vars
7491 1727203987.92548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7491 1727203987.92593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7491 1727203987.92619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7491 1727203987.92641: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203987.92658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203987.92699: variable 'inventory_hostname' from source: host vars for 'managed-node3'
7491 1727203987.92707: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203987.92715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203987.92812: Set connection var ansible_timeout to 10
7491 1727203987.92824: Set connection var ansible_pipelining to False
7491 1727203987.92832: Set connection var ansible_shell_type to sh
7491 1727203987.92840: Set connection var ansible_module_compression to ZIP_DEFLATED
7491 1727203987.92849: Set connection var ansible_shell_executable to /bin/sh
7491 1727203987.92857: Set connection var ansible_connection to ssh
7491 1727203987.92885: variable 'ansible_shell_executable' from source: unknown
7491 1727203987.92893: variable 'ansible_connection' from source: unknown
7491 1727203987.92901: variable 'ansible_module_compression' from source: unknown
7491 1727203987.92910: variable 'ansible_shell_type' from source: unknown
7491 1727203987.92917: variable 'ansible_shell_executable' from source: unknown
7491 1727203987.92924: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203987.92931: variable 'ansible_pipelining' from source: unknown
7491 1727203987.92938: variable 'ansible_timeout' from source: unknown
7491 1727203987.92945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203987.93119: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7491 1727203987.93138: variable 'omit' from source: magic vars
7491 1727203987.93149: starting attempt loop
7491 1727203987.93156: running the handler
7491 1727203987.93209: handler run complete
7491 1727203987.93230: attempt loop complete, returning result
7491 1727203987.93240: _execute() done
7491 1727203987.93247: dumping result to json
7491 1727203987.93255: done dumping result, returning
7491 1727203987.93269: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [0affcd87-79f5-0a4a-ad01-00000000130b]
7491 1727203987.93281: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000130b
ok: [managed-node3] => {}

MSG:

current_interfaces: ['eth0', 'lo']
7491 1727203987.93435: no more pending results, returning what we have
7491 1727203987.93439: results queue empty
7491 1727203987.93440: checking for any_errors_fatal
7491 1727203987.93446: done checking for any_errors_fatal
7491 1727203987.93447: checking for max_fail_percentage
7491 1727203987.93449: done checking for max_fail_percentage
7491 1727203987.93450: checking to see if all hosts have failed and the running result is not ok
7491 1727203987.93452: done checking to see if all hosts have failed
7491 1727203987.93452: getting the remaining hosts for this loop
7491 1727203987.93455: done getting the remaining hosts for this loop
7491 1727203987.93459: getting the next task for host managed-node3
7491 1727203987.93470: done getting next task for host managed-node3
7491 1727203987.93474: ^ task is: TASK: Install iproute
7491 1727203987.93477: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203987.93482: getting variables
7491 1727203987.93484: in VariableManager get_vars()
7491 1727203987.93543: Calling all_inventory to load vars for managed-node3
7491 1727203987.93547: Calling groups_inventory to load vars for managed-node3
7491 1727203987.93549: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203987.93562: Calling all_plugins_play to load vars for managed-node3
7491 1727203987.93566: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203987.93570: Calling groups_plugins_play to load vars for managed-node3
7491 1727203987.94732: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000130b
7491 1727203987.94736: WORKER PROCESS EXITING
7491 1727203987.95485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203987.97657: done with get_vars()
7491 1727203987.97693: done getting variables
7491 1727203987.98682: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Install iproute] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Tuesday 24 September 2024 14:53:07 -0400 (0:00:00.083) 0:00:29.911 *****
7491 1727203987.98716: entering _queue_task() for managed-node3/package
7491 1727203987.99096: worker is 1 (out of 1 available)
7491 1727203987.99110: exiting _queue_task() for managed-node3/package
7491 1727203987.99122: done queuing things up, now waiting for results queue to drain
7491 1727203987.99123: waiting for pending results...
7491 1727203987.99440: running TaskExecutor() for managed-node3/TASK: Install iproute
7491 1727203987.99554: in run() - task 0affcd87-79f5-0a4a-ad01-0000000010ad
7491 1727203987.99581: variable 'ansible_search_path' from source: unknown
7491 1727203987.99590: variable 'ansible_search_path' from source: unknown
7491 1727203987.99631: calling self._execute()
7491 1727203987.99734: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203987.99745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203987.99758: variable 'omit' from source: magic vars
7491 1727203988.00170: variable 'ansible_distribution_major_version' from source: facts
7491 1727203988.00188: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203988.00199: variable 'omit' from source: magic vars
7491 1727203988.00249: variable 'omit' from source: magic vars
7491 1727203988.00452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7491 1727203988.03025: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7491 1727203988.03112: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7491 1727203988.03162: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7491 1727203988.03211: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7491 1727203988.03248: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7491 1727203988.03358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7491 1727203988.03422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7491 1727203988.03459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7491 1727203988.03516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7491 1727203988.03538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7491 1727203988.03710: variable '__network_is_ostree' from source: set_fact
7491 1727203988.03723: variable 'omit' from source: magic vars
7491 1727203988.03756: variable 'omit' from source: magic vars
7491 1727203988.03950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7491 1727203988.03986: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7491 1727203988.04015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7491 1727203988.04044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203988.04059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203988.04097: variable 'inventory_hostname' from source: host vars for 'managed-node3'
7491 1727203988.04105: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203988.04113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203988.04227: Set connection var ansible_timeout to 10
7491 1727203988.04246: Set connection var ansible_pipelining to False
7491 1727203988.04278: Set connection var ansible_shell_type to sh
7491 1727203988.04290: Set connection var ansible_module_compression to ZIP_DEFLATED
7491 1727203988.04303: Set connection var ansible_shell_executable to /bin/sh
7491 1727203988.04313: Set connection var ansible_connection to ssh
7491 1727203988.04345: variable 'ansible_shell_executable' from source: unknown
7491 1727203988.04357: variable 'ansible_connection' from source: unknown
7491 1727203988.04366: variable 'ansible_module_compression' from source: unknown
7491 1727203988.04373: variable 'ansible_shell_type' from source: unknown
7491 1727203988.04384: variable 'ansible_shell_executable' from source: unknown
7491 1727203988.04390: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203988.04398: variable 'ansible_pipelining' from source: unknown
7491 1727203988.04405: variable 'ansible_timeout' from source: unknown
7491 1727203988.04412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203988.04531: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7491 1727203988.04548: variable 'omit' from source: magic vars
7491 1727203988.04559: starting attempt loop
7491 1727203988.04571: running the handler
7491 1727203988.04582: variable 'ansible_facts' from source: unknown
7491 1727203988.04590: variable 'ansible_facts' from source: unknown
7491 1727203988.04633: _low_level_execute_command(): starting
7491 1727203988.04646: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7491 1727203988.05405: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7491 1727203988.05421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203988.05448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203988.05481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203988.05531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
7491 1727203988.05561: stderr chunk (state=3): >>>debug2: match not found <<<
7491 1727203988.05595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203988.05622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7491 1727203988.05635: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<<
7491 1727203988.05647: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7491 1727203988.05676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203988.05692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203988.05708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203988.05721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
7491 1727203988.05739: stderr chunk (state=3): >>>debug2: match found <<<
7491 1727203988.05754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203988.05837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7491 1727203988.05893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7491 1727203988.05929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7491 1727203988.06032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7491 1727203988.07663: stdout chunk (state=3): >>>/root <<<
7491 1727203988.07768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7491 1727203988.07870: stderr chunk (state=3): >>><<<
7491 1727203988.07885: stdout chunk (state=3): >>><<<
7491 1727203988.07970: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.87 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7491 1727203988.07974: _low_level_execute_command(): starting
7491 1727203988.07979: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203988.0793386-8953-36114176051610 `" && echo ansible-tmp-1727203988.0793386-8953-36114176051610="` echo /root/.ansible/tmp/ansible-tmp-1727203988.0793386-8953-36114176051610 `" ) && sleep 0'
7491 1727203988.08722: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7491 1727203988.08737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203988.08756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203988.08779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203988.08843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
7491 1727203988.08867: stderr chunk (state=3): >>>debug2: match not found <<<
7491 1727203988.08889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203988.08908: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7491 1727203988.08933: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<<
7491 1727203988.08946: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7491 1727203988.08970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203988.08991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203988.09013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203988.09030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
7491 1727203988.09047: stderr chunk (state=3): >>>debug2: match found <<<
7491 1727203988.09062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203988.09517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7491 1727203988.09521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7491 1727203988.09534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7491 1727203988.09613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7491 1727203988.11450: stdout chunk (state=3): >>>ansible-tmp-1727203988.0793386-8953-36114176051610=/root/.ansible/tmp/ansible-tmp-1727203988.0793386-8953-36114176051610 <<<
7491 1727203988.11567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7491 1727203988.11652: stderr chunk (state=3): >>><<<
7491 1727203988.11668: stdout chunk (state=3): >>><<<
7491 1727203988.11874: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203988.0793386-8953-36114176051610=/root/.ansible/tmp/ansible-tmp-1727203988.0793386-8953-36114176051610
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.87 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7491 1727203988.11878: variable 'ansible_module_compression' from source: unknown
7491 1727203988.11880: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED
7491 1727203988.11882: variable 'ansible_facts' from source: unknown
7491 1727203988.11955: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203988.0793386-8953-36114176051610/AnsiballZ_dnf.py
7491 1727203988.12143: Sending initial data
7491 1727203988.12146: Sent initial data (149 bytes)
7491 1727203988.13230: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7491 1727203988.13246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203988.13261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203988.13283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203988.13334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
7491 1727203988.13347: stderr chunk (state=3): >>>debug2: match not found <<<
7491 1727203988.13362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203988.13383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7491 1727203988.13396: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<<
7491 1727203988.13413: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7491 1727203988.13427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203988.13441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203988.13457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203988.13478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
7491 1727203988.13491: stderr chunk (state=3): >>>debug2: match found <<<
7491 1727203988.13506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203988.13590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7491 1727203988.13615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7491 1727203988.13638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7491 1727203988.13714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7491 1727203988.15400: stderr chunk (state=3): >>>debug2: Remote version: 3
debug2: Server supports extension "posix-rename@openssh.com" revision 1
debug2: Server supports extension "statvfs@openssh.com" revision 2
debug2: Server supports extension "fstatvfs@openssh.com" revision 2
debug2: Server supports extension "hardlink@openssh.com" revision 1
debug2: Server supports extension "fsync@openssh.com" revision 1
debug2: Server supports extension "lsetstat@openssh.com" revision 1
debug2: Server supports extension "limits@openssh.com" revision 1
debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
7491 1727203988.15439: stderr chunk (state=3): >>>debug1: Using server download size 261120
debug1: Using server upload size 261120
debug1: Server handle limit 1019; using 64 <<<
7491 1727203988.15494: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp2_2d2t4q /root/.ansible/tmp/ansible-tmp-1727203988.0793386-8953-36114176051610/AnsiballZ_dnf.py <<<
7491 1727203988.15525: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
7491 1727203988.17003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7491 1727203988.17246: stderr chunk (state=3): >>><<<
7491 1727203988.17250: stdout chunk (state=3): >>><<<
7491 1727203988.17275: done transferring module to remote
7491 1727203988.17284: _low_level_execute_command(): starting
7491 1727203988.17289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203988.0793386-8953-36114176051610/ /root/.ansible/tmp/ansible-tmp-1727203988.0793386-8953-36114176051610/AnsiballZ_dnf.py && sleep 0'
7491 1727203988.18026: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7491 1727203988.18042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203988.18058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203988.18083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203988.18130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
7491 1727203988.18146: stderr chunk (state=3): >>>debug2: match not found <<<
7491 1727203988.18160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203988.18183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7491 1727203988.18196: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<<
7491 1727203988.18207: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7491 1727203988.18220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203988.18236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203988.18257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203988.18273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
7491 1727203988.18285: stderr chunk (state=3): >>>debug2: match found <<<
7491 1727203988.18299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203988.18383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7491 1727203988.18408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7491 1727203988.18428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7491 1727203988.18510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7491 1727203988.20159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7491 1727203988.20234: stderr chunk (state=3): >>><<<
7491 1727203988.20237: stdout chunk (state=3): >>><<<
7491 1727203988.20248: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.87 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7491
1727203988.20251: _low_level_execute_command(): starting 7491 1727203988.20257: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203988.0793386-8953-36114176051610/AnsiballZ_dnf.py && sleep 0' 7491 1727203988.20883: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203988.20892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203988.20902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203988.20918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203988.20956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203988.20960: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203988.20973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203988.20988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203988.21091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203988.21094: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203988.21096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203988.21099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203988.21100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203988.21102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203988.21104: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203988.21106: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203988.21158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203988.21183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203988.21191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203988.21373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.11906: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7491 1727203989.15792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203989.15853: stderr chunk (state=3): >>><<< 7491 1727203989.15856: stdout chunk (state=3): >>><<< 7491 1727203989.15874: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203989.15910: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203988.0793386-8953-36114176051610/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203989.15919: _low_level_execute_command(): starting 7491 1727203989.15922: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203988.0793386-8953-36114176051610/ > /dev/null 2>&1 && sleep 0' 7491 1727203989.16406: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.16409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.16431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203989.16443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203989.16452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.16501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203989.16509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.16572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.18316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.18370: stderr chunk (state=3): >>><<< 7491 1727203989.18374: stdout chunk (state=3): >>><<< 7491 1727203989.18388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.18395: handler run complete 7491 1727203989.18514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203989.18646: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203989.18678: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203989.18702: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203989.18738: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203989.18798: variable '__install_status' from source: set_fact 7491 1727203989.18813: Evaluated conditional (__install_status is success): True 7491 1727203989.18826: attempt loop complete, returning result 7491 1727203989.18829: _execute() done 7491 1727203989.18831: dumping result to json 7491 1727203989.18837: done dumping result, returning 7491 1727203989.18845: done running TaskExecutor() for managed-node3/TASK: Install iproute [0affcd87-79f5-0a4a-ad01-0000000010ad] 7491 1727203989.18849: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010ad 7491 1727203989.18949: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010ad 7491 1727203989.18952: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7491 1727203989.19033: no more pending results, returning what we have 7491 1727203989.19037: results queue empty 7491 1727203989.19038: checking for any_errors_fatal 7491 1727203989.19045: done checking for any_errors_fatal 7491 1727203989.19046: checking for max_fail_percentage 7491 1727203989.19047: done checking for max_fail_percentage 7491 1727203989.19048: 
checking to see if all hosts have failed and the running result is not ok 7491 1727203989.19049: done checking to see if all hosts have failed 7491 1727203989.19050: getting the remaining hosts for this loop 7491 1727203989.19052: done getting the remaining hosts for this loop 7491 1727203989.19056: getting the next task for host managed-node3 7491 1727203989.19061: done getting next task for host managed-node3 7491 1727203989.19065: ^ task is: TASK: Create veth interface {{ interface }} 7491 1727203989.19068: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203989.19071: getting variables 7491 1727203989.19073: in VariableManager get_vars() 7491 1727203989.19123: Calling all_inventory to load vars for managed-node3 7491 1727203989.19126: Calling groups_inventory to load vars for managed-node3 7491 1727203989.19128: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203989.19138: Calling all_plugins_play to load vars for managed-node3 7491 1727203989.19141: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203989.19144: Calling groups_plugins_play to load vars for managed-node3 7491 1727203989.19981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203989.20905: done with get_vars() 7491 1727203989.20925: done getting variables 7491 1727203989.20973: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203989.21063: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:53:09 -0400 (0:00:01.223) 0:00:31.134 ***** 7491 1727203989.21087: entering _queue_task() for managed-node3/command 7491 1727203989.21315: worker is 1 (out of 1 available) 7491 1727203989.21332: exiting _queue_task() for managed-node3/command 7491 1727203989.21347: done queuing things up, now waiting for results queue to drain 7491 1727203989.21349: waiting for pending results... 
7491 1727203989.21529: running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 7491 1727203989.21593: in run() - task 0affcd87-79f5-0a4a-ad01-0000000010ae 7491 1727203989.21609: variable 'ansible_search_path' from source: unknown 7491 1727203989.21613: variable 'ansible_search_path' from source: unknown 7491 1727203989.21813: variable 'interface' from source: play vars 7491 1727203989.21873: variable 'interface' from source: play vars 7491 1727203989.21928: variable 'interface' from source: play vars 7491 1727203989.22040: Loaded config def from plugin (lookup/items) 7491 1727203989.22044: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7491 1727203989.22066: variable 'omit' from source: magic vars 7491 1727203989.22169: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203989.22177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203989.22187: variable 'omit' from source: magic vars 7491 1727203989.22349: variable 'ansible_distribution_major_version' from source: facts 7491 1727203989.22355: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203989.22492: variable 'type' from source: play vars 7491 1727203989.22497: variable 'state' from source: include params 7491 1727203989.22499: variable 'interface' from source: play vars 7491 1727203989.22502: variable 'current_interfaces' from source: set_fact 7491 1727203989.22510: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7491 1727203989.22518: variable 'omit' from source: magic vars 7491 1727203989.22545: variable 'omit' from source: magic vars 7491 1727203989.22575: variable 'item' from source: unknown 7491 1727203989.22625: variable 'item' from source: unknown 7491 1727203989.22638: variable 'omit' from source: magic vars 7491 1727203989.22666: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203989.22687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203989.22706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203989.22719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203989.22726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203989.22751: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203989.22754: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203989.22757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203989.22833: Set connection var ansible_timeout to 10 7491 1727203989.22842: Set connection var ansible_pipelining to False 7491 1727203989.22847: Set connection var ansible_shell_type to sh 7491 1727203989.22852: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203989.22859: Set connection var ansible_shell_executable to /bin/sh 7491 1727203989.22866: Set connection var ansible_connection to ssh 7491 1727203989.22881: variable 'ansible_shell_executable' from source: unknown 7491 1727203989.22884: variable 'ansible_connection' from source: unknown 7491 1727203989.22886: variable 'ansible_module_compression' from source: unknown 7491 1727203989.22889: variable 'ansible_shell_type' from source: unknown 7491 1727203989.22891: variable 'ansible_shell_executable' from source: unknown 7491 1727203989.22893: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203989.22898: variable 'ansible_pipelining' from source: unknown 7491 1727203989.22900: variable 'ansible_timeout' from source: unknown 7491 
1727203989.22904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203989.23006: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203989.23015: variable 'omit' from source: magic vars 7491 1727203989.23022: starting attempt loop 7491 1727203989.23029: running the handler 7491 1727203989.23039: _low_level_execute_command(): starting 7491 1727203989.23050: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203989.23585: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.23589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.23622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.23625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.23628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.23671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
<<< 7491 1727203989.23687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.23736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.25256: stdout chunk (state=3): >>>/root <<< 7491 1727203989.25354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.25410: stderr chunk (state=3): >>><<< 7491 1727203989.25414: stdout chunk (state=3): >>><<< 7491 1727203989.25438: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.25448: _low_level_execute_command(): starting 7491 1727203989.25459: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727203989.2543745-8990-35204977443859 `" && echo ansible-tmp-1727203989.2543745-8990-35204977443859="` echo /root/.ansible/tmp/ansible-tmp-1727203989.2543745-8990-35204977443859 `" ) && sleep 0' 7491 1727203989.25913: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.25929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.25946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.25960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.26006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.26028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.26071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.27843: stdout chunk (state=3): >>>ansible-tmp-1727203989.2543745-8990-35204977443859=/root/.ansible/tmp/ansible-tmp-1727203989.2543745-8990-35204977443859 <<< 7491 1727203989.27954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.28009: stderr 
chunk (state=3): >>><<< 7491 1727203989.28012: stdout chunk (state=3): >>><<< 7491 1727203989.28030: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203989.2543745-8990-35204977443859=/root/.ansible/tmp/ansible-tmp-1727203989.2543745-8990-35204977443859 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.28057: variable 'ansible_module_compression' from source: unknown 7491 1727203989.28101: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203989.28132: variable 'ansible_facts' from source: unknown 7491 1727203989.28198: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203989.2543745-8990-35204977443859/AnsiballZ_command.py 7491 1727203989.28310: Sending initial data 7491 1727203989.28314: Sent initial data (153 bytes) 7491 1727203989.28993: stderr chunk 
(state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.28997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.29035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203989.29039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203989.29042: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.29095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.29098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203989.29103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.29143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.30789: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203989.30826: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203989.30863: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmps2m8vr_s /root/.ansible/tmp/ansible-tmp-1727203989.2543745-8990-35204977443859/AnsiballZ_command.py <<< 7491 1727203989.30900: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203989.31693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.31798: stderr chunk (state=3): >>><<< 7491 1727203989.31801: stdout chunk (state=3): >>><<< 7491 1727203989.31820: done transferring module to remote 7491 1727203989.31830: _low_level_execute_command(): starting 7491 1727203989.31833: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203989.2543745-8990-35204977443859/ /root/.ansible/tmp/ansible-tmp-1727203989.2543745-8990-35204977443859/AnsiballZ_command.py && sleep 0' 7491 1727203989.32284: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.32296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.32317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203989.32330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.32348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.32387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.32400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.32447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.34098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.34149: stderr chunk (state=3): >>><<< 7491 1727203989.34153: stdout chunk (state=3): >>><<< 7491 1727203989.34171: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.34174: _low_level_execute_command(): starting 7491 1727203989.34179: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203989.2543745-8990-35204977443859/AnsiballZ_command.py && sleep 0' 7491 1727203989.34621: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.34629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.34659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.34674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.34727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.34741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.34796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.48648: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-24 14:53:09.477067", "end": "2024-09-24 14:53:09.485418", "delta": "0:00:00.008351", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203989.50727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203989.50789: stderr chunk (state=3): >>><<< 7491 1727203989.50795: stdout chunk (state=3): >>><<< 7491 1727203989.50809: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-24 14:53:09.477067", "end": "2024-09-24 14:53:09.485418", "delta": "0:00:00.008351", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203989.50841: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203989.2543745-8990-35204977443859/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203989.50850: _low_level_execute_command(): starting 7491 1727203989.50855: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203989.2543745-8990-35204977443859/ > /dev/null 2>&1 && sleep 0' 7491 1727203989.51327: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.51331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.51369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203989.51372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.51379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203989.51381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.51430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.51433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.51491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.54390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.54454: stderr chunk (state=3): >>><<< 7491 1727203989.54457: stdout chunk (state=3): >>><<< 7491 1727203989.54475: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.54481: handler run complete 7491 1727203989.54503: Evaluated conditional (False): False 7491 1727203989.54511: attempt loop complete, returning result 7491 1727203989.54529: variable 'item' from source: unknown 7491 1727203989.54593: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.008351", "end": "2024-09-24 14:53:09.485418", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-24 14:53:09.477067" } 7491 1727203989.54774: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203989.54777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203989.54780: variable 'omit' from source: magic vars 7491 1727203989.54851: variable 'ansible_distribution_major_version' from source: facts 7491 1727203989.54855: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203989.54980: variable 'type' from source: play vars 7491 1727203989.54984: variable 'state' from source: include params 7491 1727203989.54987: variable 'interface' from 
source: play vars 7491 1727203989.54991: variable 'current_interfaces' from source: set_fact 7491 1727203989.54998: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7491 1727203989.55001: variable 'omit' from source: magic vars 7491 1727203989.55015: variable 'omit' from source: magic vars 7491 1727203989.55043: variable 'item' from source: unknown 7491 1727203989.55087: variable 'item' from source: unknown 7491 1727203989.55098: variable 'omit' from source: magic vars 7491 1727203989.55119: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203989.55130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203989.55136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203989.55146: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203989.55149: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203989.55152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203989.55204: Set connection var ansible_timeout to 10 7491 1727203989.55207: Set connection var ansible_pipelining to False 7491 1727203989.55213: Set connection var ansible_shell_type to sh 7491 1727203989.55220: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203989.55233: Set connection var ansible_shell_executable to /bin/sh 7491 1727203989.55240: Set connection var ansible_connection to ssh 7491 1727203989.55254: variable 'ansible_shell_executable' from source: unknown 7491 1727203989.55257: variable 'ansible_connection' from source: unknown 7491 1727203989.55259: variable 'ansible_module_compression' from source: unknown 7491 1727203989.55262: 
variable 'ansible_shell_type' from source: unknown 7491 1727203989.55269: variable 'ansible_shell_executable' from source: unknown 7491 1727203989.55271: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203989.55274: variable 'ansible_pipelining' from source: unknown 7491 1727203989.55276: variable 'ansible_timeout' from source: unknown 7491 1727203989.55278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203989.55347: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203989.55356: variable 'omit' from source: magic vars 7491 1727203989.55359: starting attempt loop 7491 1727203989.55361: running the handler 7491 1727203989.55370: _low_level_execute_command(): starting 7491 1727203989.55373: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203989.55827: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.55839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.55862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203989.55878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203989.55896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.55933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.55945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.55999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.57511: stdout chunk (state=3): >>>/root <<< 7491 1727203989.57609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.57666: stderr chunk (state=3): >>><<< 7491 1727203989.57670: stdout chunk (state=3): >>><<< 7491 1727203989.57683: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.57691: _low_level_execute_command(): starting 7491 1727203989.57696: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203989.5768325-8990-133253320544395 `" && echo ansible-tmp-1727203989.5768325-8990-133253320544395="` echo /root/.ansible/tmp/ansible-tmp-1727203989.5768325-8990-133253320544395 `" ) && sleep 0' 7491 1727203989.58162: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.58171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.58196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203989.58208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203989.58218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.58263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.58282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.58330: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.60110: stdout chunk (state=3): >>>ansible-tmp-1727203989.5768325-8990-133253320544395=/root/.ansible/tmp/ansible-tmp-1727203989.5768325-8990-133253320544395 <<< 7491 1727203989.60225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.60276: stderr chunk (state=3): >>><<< 7491 1727203989.60279: stdout chunk (state=3): >>><<< 7491 1727203989.60296: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203989.5768325-8990-133253320544395=/root/.ansible/tmp/ansible-tmp-1727203989.5768325-8990-133253320544395 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.60322: variable 'ansible_module_compression' from source: unknown 7491 1727203989.60351: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203989.60367: variable 'ansible_facts' from source: unknown 7491 1727203989.60418: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203989.5768325-8990-133253320544395/AnsiballZ_command.py 7491 1727203989.60514: Sending initial data 7491 1727203989.60520: Sent initial data (154 bytes) 7491 1727203989.61199: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203989.61211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.61234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203989.61246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203989.61256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.61313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203989.61320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.61377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.63020: stderr chunk (state=3): >>>debug2: Remote 
version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203989.63054: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203989.63093: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp56zrsf6o /root/.ansible/tmp/ansible-tmp-1727203989.5768325-8990-133253320544395/AnsiballZ_command.py <<< 7491 1727203989.63129: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203989.63984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.64077: stderr chunk (state=3): >>><<< 7491 1727203989.64081: stdout chunk (state=3): >>><<< 7491 1727203989.64098: done transferring module to remote 7491 1727203989.64105: _low_level_execute_command(): starting 7491 1727203989.64110: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203989.5768325-8990-133253320544395/ /root/.ansible/tmp/ansible-tmp-1727203989.5768325-8990-133253320544395/AnsiballZ_command.py && sleep 0' 7491 1727203989.64572: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.64578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 
1727203989.64612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203989.64627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.64676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.64688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.64735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.66401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.66449: stderr chunk (state=3): >>><<< 7491 1727203989.66453: stdout chunk (state=3): >>><<< 7491 1727203989.66467: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.66470: _low_level_execute_command(): starting 7491 1727203989.66476: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203989.5768325-8990-133253320544395/AnsiballZ_command.py && sleep 0' 7491 1727203989.66913: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.66922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.66969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.66972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.66974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 
1727203989.67027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.67030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.67088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.80400: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-24 14:53:09.799845", "end": "2024-09-24 14:53:09.803238", "delta": "0:00:00.003393", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203989.81589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203989.81623: stderr chunk (state=3): >>><<< 7491 1727203989.81627: stdout chunk (state=3): >>><<< 7491 1727203989.81643: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-24 14:53:09.799845", "end": "2024-09-24 14:53:09.803238", "delta": "0:00:00.003393", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
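(For readers tracing the module results above: the `start`, `end`, and `delta` fields in each `ansible.legacy.command` payload are related by simple timestamp subtraction. A minimal sketch, using the `ip link set peerveth0 up` result exactly as logged; the parsing code itself is illustrative, not part of Ansible.)

```python
import json
from datetime import datetime

# Module result payload as it appears in the log above
# (the ansible.legacy.command task that brings peerveth0 up).
result = json.loads(
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
    '"cmd": ["ip", "link", "set", "peerveth0", "up"], '
    '"start": "2024-09-24 14:53:09.799845", '
    '"end": "2024-09-24 14:53:09.803238", '
    '"delta": "0:00:00.003393", "msg": ""}'
)

fmt = "%Y-%m-%d %H:%M:%S.%f"
delta = datetime.strptime(result["end"], fmt) - datetime.strptime(result["start"], fmt)

# The module's "delta" string is end minus start, rendered as a timedelta.
assert str(delta) == result["delta"]
assert result["rc"] == 0 and result["changed"]
```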
7491 1727203989.81680: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203989.5768325-8990-133253320544395/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203989.81686: _low_level_execute_command(): starting 7491 1727203989.81691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203989.5768325-8990-133253320544395/ > /dev/null 2>&1 && sleep 0' 7491 1727203989.82336: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203989.82350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.82367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.82387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.82437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203989.82449: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203989.82468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.82487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203989.82507: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203989.82522: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203989.82539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.82555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.82572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.82585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203989.82598: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203989.82611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.82694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.82712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203989.82728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.82811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.84530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.84606: stderr chunk (state=3): >>><<< 7491 1727203989.84612: stdout chunk (state=3): >>><<< 7491 1727203989.84653: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.84656: handler run complete 7491 1727203989.84658: Evaluated conditional (False): False 7491 1727203989.84695: attempt loop complete, returning result 7491 1727203989.84698: variable 'item' from source: unknown 7491 1727203989.85108: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003393", "end": "2024-09-24 14:53:09.803238", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-24 14:53:09.799845" } 7491 1727203989.85502: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203989.85505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203989.85507: variable 'omit' from source: magic vars 7491 1727203989.85509: variable 'ansible_distribution_major_version' from source: facts 7491 1727203989.85511: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203989.85513: variable 'type' from source: play vars 7491 1727203989.85515: variable 'state' from source: include params 7491 1727203989.85520: variable 'interface' from source: play vars 7491 1727203989.85522: variable 'current_interfaces' from source: set_fact 7491 1727203989.85524: Evaluated conditional 
(type == 'veth' and state == 'present' and interface not in current_interfaces): True 7491 1727203989.85526: variable 'omit' from source: magic vars 7491 1727203989.85528: variable 'omit' from source: magic vars 7491 1727203989.85530: variable 'item' from source: unknown 7491 1727203989.85532: variable 'item' from source: unknown 7491 1727203989.85534: variable 'omit' from source: magic vars 7491 1727203989.85536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203989.85538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203989.85541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203989.85543: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203989.85545: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203989.85548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203989.86275: Set connection var ansible_timeout to 10 7491 1727203989.86279: Set connection var ansible_pipelining to False 7491 1727203989.86281: Set connection var ansible_shell_type to sh 7491 1727203989.86283: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203989.86285: Set connection var ansible_shell_executable to /bin/sh 7491 1727203989.86286: Set connection var ansible_connection to ssh 7491 1727203989.86288: variable 'ansible_shell_executable' from source: unknown 7491 1727203989.86290: variable 'ansible_connection' from source: unknown 7491 1727203989.86292: variable 'ansible_module_compression' from source: unknown 7491 1727203989.86294: variable 'ansible_shell_type' from source: unknown 7491 1727203989.86295: variable 'ansible_shell_executable' from source: unknown 7491 
1727203989.86302: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203989.86304: variable 'ansible_pipelining' from source: unknown 7491 1727203989.86306: variable 'ansible_timeout' from source: unknown 7491 1727203989.86307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203989.86310: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203989.86311: variable 'omit' from source: magic vars 7491 1727203989.86313: starting attempt loop 7491 1727203989.86315: running the handler 7491 1727203989.86320: _low_level_execute_command(): starting 7491 1727203989.86322: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203989.86406: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203989.86418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.86430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.86443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.86491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203989.86497: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203989.86500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.86516: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203989.86528: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203989.86538: stderr 
chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203989.86548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.86561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.86582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.86593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203989.86604: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203989.86620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.86707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.86739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203989.86755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.86836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.88329: stdout chunk (state=3): >>>/root <<< 7491 1727203989.88432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.88484: stderr chunk (state=3): >>><<< 7491 1727203989.88487: stdout chunk (state=3): >>><<< 7491 1727203989.88502: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.88509: _low_level_execute_command(): starting 7491 1727203989.88514: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203989.885013-8990-142070118943049 `" && echo ansible-tmp-1727203989.885013-8990-142070118943049="` echo /root/.ansible/tmp/ansible-tmp-1727203989.885013-8990-142070118943049 `" ) && sleep 0' 7491 1727203989.88954: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.88959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.88993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.89002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203989.89007: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727203989.89013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 
1727203989.89030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.89035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.89095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.89099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.89150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.90943: stdout chunk (state=3): >>>ansible-tmp-1727203989.885013-8990-142070118943049=/root/.ansible/tmp/ansible-tmp-1727203989.885013-8990-142070118943049 <<< 7491 1727203989.91055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.91110: stderr chunk (state=3): >>><<< 7491 1727203989.91113: stdout chunk (state=3): >>><<< 7491 1727203989.91127: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203989.885013-8990-142070118943049=/root/.ansible/tmp/ansible-tmp-1727203989.885013-8990-142070118943049 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.91147: variable 'ansible_module_compression' from source: unknown 7491 1727203989.91182: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203989.91197: variable 'ansible_facts' from source: unknown 7491 1727203989.91242: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203989.885013-8990-142070118943049/AnsiballZ_command.py 7491 1727203989.91339: Sending initial data 7491 1727203989.91342: Sent initial data (153 bytes) 7491 1727203989.92012: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.92020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.92071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203989.92074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.92076: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.92131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.92135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.92186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.93871: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203989.93901: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203989.93941: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp_qv30qi_ /root/.ansible/tmp/ansible-tmp-1727203989.885013-8990-142070118943049/AnsiballZ_command.py <<< 7491 1727203989.93975: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203989.94766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.94878: stderr chunk (state=3): >>><<< 7491 1727203989.94882: stdout chunk (state=3): >>><<< 7491 1727203989.94898: done transferring module to remote 7491 1727203989.94905: 
_low_level_execute_command(): starting 7491 1727203989.94909: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203989.885013-8990-142070118943049/ /root/.ansible/tmp/ansible-tmp-1727203989.885013-8990-142070118943049/AnsiballZ_command.py && sleep 0' 7491 1727203989.95397: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.95401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.95434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203989.95437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.95439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203989.95441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.95496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.95501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.95543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203989.97259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203989.97315: 
stderr chunk (state=3): >>><<< 7491 1727203989.97323: stdout chunk (state=3): >>><<< 7491 1727203989.97336: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203989.97339: _low_level_execute_command(): starting 7491 1727203989.97344: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203989.885013-8990-142070118943049/AnsiballZ_command.py && sleep 0' 7491 1727203989.97812: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203989.97816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203989.97850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.97863: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203989.97920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203989.97933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203989.97989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203990.11608: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-24 14:53:10.108702", "end": "2024-09-24 14:53:10.113070", "delta": "0:00:00.004368", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203990.12567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203990.12631: stderr chunk (state=3): >>><<< 7491 1727203990.12635: stdout chunk (state=3): >>><<< 7491 1727203990.12654: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-24 14:53:10.108702", "end": "2024-09-24 14:53:10.113070", "delta": "0:00:00.004368", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
7491 1727203990.12677: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203989.885013-8990-142070118943049/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203990.12682: _low_level_execute_command(): starting 7491 1727203990.12687: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203989.885013-8990-142070118943049/ > /dev/null 2>&1 && sleep 0' 7491 1727203990.13443: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203990.13487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203990.15211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203990.15269: stderr chunk (state=3): >>><<< 7491 1727203990.15273: stdout chunk (state=3): >>><<< 7491 1727203990.15285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203990.15290: handler run complete 7491 1727203990.15306: Evaluated conditional (False): False 7491 1727203990.15313: attempt loop complete, returning result 7491 1727203990.15336: variable 'item' from source: unknown 7491 1727203990.15399: variable 'item' from source: unknown ok: 
[managed-node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.004368", "end": "2024-09-24 14:53:10.113070", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-24 14:53:10.108702" } 7491 1727203990.15515: dumping result to json 7491 1727203990.15518: done dumping result, returning 7491 1727203990.15521: done running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 [0affcd87-79f5-0a4a-ad01-0000000010ae] 7491 1727203990.15523: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010ae 7491 1727203990.15572: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010ae 7491 1727203990.15575: WORKER PROCESS EXITING 7491 1727203990.15633: no more pending results, returning what we have 7491 1727203990.15637: results queue empty 7491 1727203990.15637: checking for any_errors_fatal 7491 1727203990.15648: done checking for any_errors_fatal 7491 1727203990.15649: checking for max_fail_percentage 7491 1727203990.15650: done checking for max_fail_percentage 7491 1727203990.15651: checking to see if all hosts have failed and the running result is not ok 7491 1727203990.15658: done checking to see if all hosts have failed 7491 1727203990.15658: getting the remaining hosts for this loop 7491 1727203990.15660: done getting the remaining hosts for this loop 7491 1727203990.15669: getting the next task for host managed-node3 7491 1727203990.15675: done getting next task for host managed-node3 7491 1727203990.15677: ^ task is: TASK: Set up veth as managed by NetworkManager 7491 1727203990.15679: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203990.15683: getting variables 7491 1727203990.15686: in VariableManager get_vars() 7491 1727203990.15735: Calling all_inventory to load vars for managed-node3 7491 1727203990.15738: Calling groups_inventory to load vars for managed-node3 7491 1727203990.15740: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203990.15750: Calling all_plugins_play to load vars for managed-node3 7491 1727203990.15752: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203990.15755: Calling groups_plugins_play to load vars for managed-node3 7491 1727203990.17206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203990.18856: done with get_vars() 7491 1727203990.18893: done getting variables 7491 1727203990.18961: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.979) 0:00:32.113 ***** 7491 1727203990.18997: entering _queue_task() for managed-node3/command 7491 1727203990.19329: worker is 1 (out of 1 available) 7491 1727203990.19342: exiting _queue_task() for 
managed-node3/command 7491 1727203990.19355: done queuing things up, now waiting for results queue to drain 7491 1727203990.19356: waiting for pending results... 7491 1727203990.19647: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 7491 1727203990.19761: in run() - task 0affcd87-79f5-0a4a-ad01-0000000010af 7491 1727203990.19784: variable 'ansible_search_path' from source: unknown 7491 1727203990.19791: variable 'ansible_search_path' from source: unknown 7491 1727203990.19836: calling self._execute() 7491 1727203990.19947: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203990.19957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203990.19975: variable 'omit' from source: magic vars 7491 1727203990.20367: variable 'ansible_distribution_major_version' from source: facts 7491 1727203990.20387: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203990.20550: variable 'type' from source: play vars 7491 1727203990.20567: variable 'state' from source: include params 7491 1727203990.20577: Evaluated conditional (type == 'veth' and state == 'present'): True 7491 1727203990.20587: variable 'omit' from source: magic vars 7491 1727203990.20631: variable 'omit' from source: magic vars 7491 1727203990.20739: variable 'interface' from source: play vars 7491 1727203990.20760: variable 'omit' from source: magic vars 7491 1727203990.20811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203990.20850: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203990.20878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203990.20903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203990.20918: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203990.20951: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203990.20959: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203990.20968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203990.21074: Set connection var ansible_timeout to 10 7491 1727203990.21085: Set connection var ansible_pipelining to False 7491 1727203990.21094: Set connection var ansible_shell_type to sh 7491 1727203990.21106: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203990.21117: Set connection var ansible_shell_executable to /bin/sh 7491 1727203990.21126: Set connection var ansible_connection to ssh 7491 1727203990.21151: variable 'ansible_shell_executable' from source: unknown 7491 1727203990.21158: variable 'ansible_connection' from source: unknown 7491 1727203990.21166: variable 'ansible_module_compression' from source: unknown 7491 1727203990.21173: variable 'ansible_shell_type' from source: unknown 7491 1727203990.21179: variable 'ansible_shell_executable' from source: unknown 7491 1727203990.21185: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203990.21192: variable 'ansible_pipelining' from source: unknown 7491 1727203990.21199: variable 'ansible_timeout' from source: unknown 7491 1727203990.21207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203990.21350: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203990.21367: variable 'omit' from source: magic vars 7491 1727203990.21376: 
starting attempt loop 7491 1727203990.21382: running the handler 7491 1727203990.21401: _low_level_execute_command(): starting 7491 1727203990.21413: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203990.22198: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203990.22214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203990.22230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203990.22250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203990.22297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203990.22313: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203990.22328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203990.22347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203990.22360: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203990.22375: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203990.22390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203990.22406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203990.22426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203990.22438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203990.22447: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203990.22459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 
1727203990.22533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203990.22555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203990.22572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203990.22649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203990.24181: stdout chunk (state=3): >>>/root <<< 7491 1727203990.24384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203990.24388: stdout chunk (state=3): >>><<< 7491 1727203990.24391: stderr chunk (state=3): >>><<< 7491 1727203990.24519: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203990.24523: _low_level_execute_command(): starting 7491 1727203990.24534: _low_level_execute_command(): 
executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203990.244145-9028-27608724811105 `" && echo ansible-tmp-1727203990.244145-9028-27608724811105="` echo /root/.ansible/tmp/ansible-tmp-1727203990.244145-9028-27608724811105 `" ) && sleep 0' 7491 1727203990.25110: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203990.25128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203990.25142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203990.25158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203990.25205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203990.25219: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203990.25233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203990.25249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203990.25259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203990.25271: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203990.25281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203990.25292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203990.25305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203990.25315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203990.25328: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203990.25341: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203990.25419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203990.25441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203990.25456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203990.25529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203990.27332: stdout chunk (state=3): >>>ansible-tmp-1727203990.244145-9028-27608724811105=/root/.ansible/tmp/ansible-tmp-1727203990.244145-9028-27608724811105 <<< 7491 1727203990.27470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203990.27601: stderr chunk (state=3): >>><<< 7491 1727203990.27614: stdout chunk (state=3): >>><<< 7491 1727203990.27777: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203990.244145-9028-27608724811105=/root/.ansible/tmp/ansible-tmp-1727203990.244145-9028-27608724811105 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203990.27780: variable 'ansible_module_compression' from source: unknown 7491 1727203990.27783: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203990.27886: variable 'ansible_facts' from source: unknown 7491 1727203990.27944: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203990.244145-9028-27608724811105/AnsiballZ_command.py 7491 1727203990.28115: Sending initial data 7491 1727203990.28120: Sent initial data (152 bytes) 7491 1727203990.29082: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203990.29088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203990.29130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203990.29133: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203990.29136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203990.29139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203990.29141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203990.29197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203990.29201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203990.29205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203990.29246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203990.30904: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203990.30938: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203990.30977: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp1p7wfn6b /root/.ansible/tmp/ansible-tmp-1727203990.244145-9028-27608724811105/AnsiballZ_command.py <<< 7491 1727203990.31013: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203990.31967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203990.32069: stderr chunk (state=3): >>><<< 7491 1727203990.32072: stdout chunk (state=3): >>><<< 7491 1727203990.32094: done transferring module to remote 7491 1727203990.32103: _low_level_execute_command(): starting 7491 1727203990.32107: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727203990.244145-9028-27608724811105/ /root/.ansible/tmp/ansible-tmp-1727203990.244145-9028-27608724811105/AnsiballZ_command.py && sleep 0' 7491 1727203990.32559: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203990.32563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203990.32597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203990.32602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203990.32604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203990.32657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203990.32660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203990.32707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203990.34397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203990.34465: stderr chunk (state=3): >>><<< 7491 1727203990.34468: stdout chunk (state=3): >>><<< 7491 1727203990.34566: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203990.34571: _low_level_execute_command(): starting 7491 1727203990.34574: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203990.244145-9028-27608724811105/AnsiballZ_command.py && sleep 0' 7491 1727203990.35241: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203990.35245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203990.35286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203990.35289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203990.35291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203990.35348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203990.35352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203990.35354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203990.35405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203990.50261: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-24 14:53:10.482672", "end": "2024-09-24 14:53:10.501710", "delta": "0:00:00.019038", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203990.51515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203990.51522: stdout chunk (state=3): >>><<< 7491 1727203990.51525: stderr chunk (state=3): >>><<< 7491 1727203990.51551: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-24 14:53:10.482672", "end": "2024-09-24 14:53:10.501710", "delta": "0:00:00.019038", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
7491 1727203990.51593: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203990.244145-9028-27608724811105/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203990.51602: _low_level_execute_command(): starting 7491 1727203990.51605: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203990.244145-9028-27608724811105/ > /dev/null 2>&1 && sleep 0' 7491 1727203990.52327: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203990.52345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203990.52358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203990.52376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203990.52420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203990.52424: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203990.52435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203990.52455: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203990.52468: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203990.52474: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203990.52482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203990.52492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203990.52503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203990.52510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203990.52520: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203990.52527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203990.52610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203990.52629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203990.52642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203990.52718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203990.54542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203990.54546: stdout chunk (state=3): >>><<< 7491 1727203990.54553: stderr chunk (state=3): >>><<< 7491 1727203990.54573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203990.54578: handler run complete 7491 1727203990.54604: Evaluated conditional (False): False 7491 1727203990.54615: attempt loop complete, returning result 7491 1727203990.54620: _execute() done 7491 1727203990.54623: dumping result to json 7491 1727203990.54625: done dumping result, returning 7491 1727203990.54634: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-0a4a-ad01-0000000010af] 7491 1727203990.54639: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010af 7491 1727203990.54749: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010af 7491 1727203990.54752: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.019038", "end": "2024-09-24 14:53:10.501710", "rc": 0, "start": "2024-09-24 14:53:10.482672" } 7491 1727203990.54814: no more pending results, returning what we have 7491 1727203990.54820: results queue empty 7491 1727203990.54822: checking for any_errors_fatal 7491 1727203990.54838: done checking for any_errors_fatal 7491 1727203990.54838: checking for max_fail_percentage 7491 1727203990.54840: done checking for max_fail_percentage 7491 1727203990.54841: checking to see if all hosts have failed and the running result is not ok 7491 
1727203990.54842: done checking to see if all hosts have failed 7491 1727203990.54843: getting the remaining hosts for this loop 7491 1727203990.54845: done getting the remaining hosts for this loop 7491 1727203990.54849: getting the next task for host managed-node3 7491 1727203990.54854: done getting next task for host managed-node3 7491 1727203990.54856: ^ task is: TASK: Delete veth interface {{ interface }} 7491 1727203990.54859: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203990.54871: getting variables 7491 1727203990.54873: in VariableManager get_vars() 7491 1727203990.54922: Calling all_inventory to load vars for managed-node3 7491 1727203990.54925: Calling groups_inventory to load vars for managed-node3 7491 1727203990.54927: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203990.54937: Calling all_plugins_play to load vars for managed-node3 7491 1727203990.54939: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203990.54942: Calling groups_plugins_play to load vars for managed-node3 7491 1727203990.56460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203990.58537: done with get_vars() 7491 1727203990.58572: done getting variables 7491 1727203990.58632: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203990.58749: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.397) 0:00:32.511 ***** 7491 1727203990.58782: entering _queue_task() for managed-node3/command 7491 1727203990.59092: worker is 1 (out of 1 available) 7491 1727203990.59103: exiting _queue_task() for managed-node3/command 7491 1727203990.59116: done queuing things up, now waiting for results queue to drain 7491 1727203990.59118: waiting for pending results... 
7491 1727203990.59398: running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 7491 1727203990.59508: in run() - task 0affcd87-79f5-0a4a-ad01-0000000010b0 7491 1727203990.59529: variable 'ansible_search_path' from source: unknown 7491 1727203990.59536: variable 'ansible_search_path' from source: unknown 7491 1727203990.59582: calling self._execute() 7491 1727203990.59688: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203990.59699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203990.59714: variable 'omit' from source: magic vars 7491 1727203990.60078: variable 'ansible_distribution_major_version' from source: facts 7491 1727203990.60099: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203990.60306: variable 'type' from source: play vars 7491 1727203990.60322: variable 'state' from source: include params 7491 1727203990.60332: variable 'interface' from source: play vars 7491 1727203990.60339: variable 'current_interfaces' from source: set_fact 7491 1727203990.60351: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 7491 1727203990.60359: when evaluation is False, skipping this task 7491 1727203990.60367: _execute() done 7491 1727203990.60375: dumping result to json 7491 1727203990.60383: done dumping result, returning 7491 1727203990.60392: done running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 [0affcd87-79f5-0a4a-ad01-0000000010b0] 7491 1727203990.60405: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010b0 7491 1727203990.60507: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010b0 7491 1727203990.60513: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7491 1727203990.60574: 
no more pending results, returning what we have 7491 1727203990.60578: results queue empty 7491 1727203990.60579: checking for any_errors_fatal 7491 1727203990.60589: done checking for any_errors_fatal 7491 1727203990.60589: checking for max_fail_percentage 7491 1727203990.60591: done checking for max_fail_percentage 7491 1727203990.60592: checking to see if all hosts have failed and the running result is not ok 7491 1727203990.60593: done checking to see if all hosts have failed 7491 1727203990.60594: getting the remaining hosts for this loop 7491 1727203990.60596: done getting the remaining hosts for this loop 7491 1727203990.60600: getting the next task for host managed-node3 7491 1727203990.60606: done getting next task for host managed-node3 7491 1727203990.60608: ^ task is: TASK: Create dummy interface {{ interface }} 7491 1727203990.60611: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203990.60615: getting variables 7491 1727203990.60617: in VariableManager get_vars() 7491 1727203990.60673: Calling all_inventory to load vars for managed-node3 7491 1727203990.60676: Calling groups_inventory to load vars for managed-node3 7491 1727203990.60679: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203990.60694: Calling all_plugins_play to load vars for managed-node3 7491 1727203990.60697: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203990.60701: Calling groups_plugins_play to load vars for managed-node3 7491 1727203990.62571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203990.64202: done with get_vars() 7491 1727203990.64236: done getting variables 7491 1727203990.64294: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203990.64405: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.056) 0:00:32.568 ***** 7491 1727203990.64440: entering _queue_task() for managed-node3/command 7491 1727203990.64774: worker is 1 (out of 1 available) 7491 1727203990.64787: exiting _queue_task() for managed-node3/command 7491 1727203990.64801: done queuing things up, now waiting for results queue to drain 7491 1727203990.64802: waiting for pending results... 
7491 1727203990.65097: running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 7491 1727203990.65219: in run() - task 0affcd87-79f5-0a4a-ad01-0000000010b1 7491 1727203990.65243: variable 'ansible_search_path' from source: unknown 7491 1727203990.65254: variable 'ansible_search_path' from source: unknown 7491 1727203990.65299: calling self._execute() 7491 1727203990.65421: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203990.65433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203990.65451: variable 'omit' from source: magic vars 7491 1727203990.65829: variable 'ansible_distribution_major_version' from source: facts 7491 1727203990.65846: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203990.66057: variable 'type' from source: play vars 7491 1727203990.66070: variable 'state' from source: include params 7491 1727203990.66080: variable 'interface' from source: play vars 7491 1727203990.66089: variable 'current_interfaces' from source: set_fact 7491 1727203990.66101: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7491 1727203990.66109: when evaluation is False, skipping this task 7491 1727203990.66115: _execute() done 7491 1727203990.66128: dumping result to json 7491 1727203990.66136: done dumping result, returning 7491 1727203990.66145: done running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 [0affcd87-79f5-0a4a-ad01-0000000010b1] 7491 1727203990.66156: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010b1 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7491 1727203990.66302: no more pending results, returning what we have 7491 1727203990.66307: results queue empty 7491 1727203990.66309: checking for 
any_errors_fatal 7491 1727203990.66316: done checking for any_errors_fatal 7491 1727203990.66317: checking for max_fail_percentage 7491 1727203990.66319: done checking for max_fail_percentage 7491 1727203990.66320: checking to see if all hosts have failed and the running result is not ok 7491 1727203990.66321: done checking to see if all hosts have failed 7491 1727203990.66322: getting the remaining hosts for this loop 7491 1727203990.66324: done getting the remaining hosts for this loop 7491 1727203990.66329: getting the next task for host managed-node3 7491 1727203990.66336: done getting next task for host managed-node3 7491 1727203990.66339: ^ task is: TASK: Delete dummy interface {{ interface }} 7491 1727203990.66342: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203990.66347: getting variables 7491 1727203990.66349: in VariableManager get_vars() 7491 1727203990.66408: Calling all_inventory to load vars for managed-node3 7491 1727203990.66411: Calling groups_inventory to load vars for managed-node3 7491 1727203990.66414: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203990.66429: Calling all_plugins_play to load vars for managed-node3 7491 1727203990.66433: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203990.66437: Calling groups_plugins_play to load vars for managed-node3 7491 1727203990.67383: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010b1 7491 1727203990.67387: WORKER PROCESS EXITING 7491 1727203990.68126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203990.69932: done with get_vars() 7491 1727203990.69958: done getting variables 7491 1727203990.70020: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203990.70134: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.057) 0:00:32.625 ***** 7491 1727203990.70168: entering _queue_task() for managed-node3/command 7491 1727203990.70477: worker is 1 (out of 1 available) 7491 1727203990.70489: exiting _queue_task() for managed-node3/command 7491 1727203990.70502: done queuing things up, now waiting for results queue to drain 7491 1727203990.70503: waiting for pending results... 
7491 1727203990.70793: running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 7491 1727203990.70912: in run() - task 0affcd87-79f5-0a4a-ad01-0000000010b2 7491 1727203990.70936: variable 'ansible_search_path' from source: unknown 7491 1727203990.70950: variable 'ansible_search_path' from source: unknown 7491 1727203990.70995: calling self._execute() 7491 1727203990.71096: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203990.71105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203990.71117: variable 'omit' from source: magic vars 7491 1727203990.71505: variable 'ansible_distribution_major_version' from source: facts 7491 1727203990.71524: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203990.71747: variable 'type' from source: play vars 7491 1727203990.71758: variable 'state' from source: include params 7491 1727203990.71771: variable 'interface' from source: play vars 7491 1727203990.71780: variable 'current_interfaces' from source: set_fact 7491 1727203990.71793: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7491 1727203990.71801: when evaluation is False, skipping this task 7491 1727203990.71808: _execute() done 7491 1727203990.71819: dumping result to json 7491 1727203990.71827: done dumping result, returning 7491 1727203990.71837: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 [0affcd87-79f5-0a4a-ad01-0000000010b2] 7491 1727203990.71849: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010b2 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7491 1727203990.72002: no more pending results, returning what we have 7491 1727203990.72008: results queue empty 7491 1727203990.72009: checking for 
any_errors_fatal 7491 1727203990.72015: done checking for any_errors_fatal 7491 1727203990.72016: checking for max_fail_percentage 7491 1727203990.72018: done checking for max_fail_percentage 7491 1727203990.72019: checking to see if all hosts have failed and the running result is not ok 7491 1727203990.72020: done checking to see if all hosts have failed 7491 1727203990.72021: getting the remaining hosts for this loop 7491 1727203990.72024: done getting the remaining hosts for this loop 7491 1727203990.72028: getting the next task for host managed-node3 7491 1727203990.72035: done getting next task for host managed-node3 7491 1727203990.72038: ^ task is: TASK: Create tap interface {{ interface }} 7491 1727203990.72042: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203990.72046: getting variables 7491 1727203990.72048: in VariableManager get_vars() 7491 1727203990.72111: Calling all_inventory to load vars for managed-node3 7491 1727203990.72115: Calling groups_inventory to load vars for managed-node3 7491 1727203990.72117: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203990.72134: Calling all_plugins_play to load vars for managed-node3 7491 1727203990.72138: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203990.72141: Calling groups_plugins_play to load vars for managed-node3 7491 1727203990.77900: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010b2 7491 1727203990.77904: WORKER PROCESS EXITING 7491 1727203990.78822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203990.80530: done with get_vars() 7491 1727203990.80555: done getting variables 7491 1727203990.80610: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203990.80708: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.105) 0:00:32.731 ***** 7491 1727203990.80733: entering _queue_task() for managed-node3/command 7491 1727203990.81058: worker is 1 (out of 1 available) 7491 1727203990.81075: exiting _queue_task() for managed-node3/command 7491 1727203990.81091: done queuing things up, now waiting for results queue to drain 7491 1727203990.81093: waiting for pending results... 
7491 1727203990.81412: running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 7491 1727203990.81541: in run() - task 0affcd87-79f5-0a4a-ad01-0000000010b3 7491 1727203990.81565: variable 'ansible_search_path' from source: unknown 7491 1727203990.81575: variable 'ansible_search_path' from source: unknown 7491 1727203990.81620: calling self._execute() 7491 1727203990.81733: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203990.81744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203990.81766: variable 'omit' from source: magic vars 7491 1727203990.82162: variable 'ansible_distribution_major_version' from source: facts 7491 1727203990.82186: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203990.82413: variable 'type' from source: play vars 7491 1727203990.82427: variable 'state' from source: include params 7491 1727203990.82438: variable 'interface' from source: play vars 7491 1727203990.82447: variable 'current_interfaces' from source: set_fact 7491 1727203990.82460: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7491 1727203990.82469: when evaluation is False, skipping this task 7491 1727203990.82476: _execute() done 7491 1727203990.82484: dumping result to json 7491 1727203990.82492: done dumping result, returning 7491 1727203990.82502: done running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 [0affcd87-79f5-0a4a-ad01-0000000010b3] 7491 1727203990.82518: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010b3 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7491 1727203990.82674: no more pending results, returning what we have 7491 1727203990.82679: results queue empty 7491 1727203990.82680: checking for 
any_errors_fatal 7491 1727203990.82690: done checking for any_errors_fatal 7491 1727203990.82691: checking for max_fail_percentage 7491 1727203990.82693: done checking for max_fail_percentage 7491 1727203990.82694: checking to see if all hosts have failed and the running result is not ok 7491 1727203990.82695: done checking to see if all hosts have failed 7491 1727203990.82696: getting the remaining hosts for this loop 7491 1727203990.82698: done getting the remaining hosts for this loop 7491 1727203990.82703: getting the next task for host managed-node3 7491 1727203990.82710: done getting next task for host managed-node3 7491 1727203990.82713: ^ task is: TASK: Delete tap interface {{ interface }} 7491 1727203990.82716: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203990.82720: getting variables 7491 1727203990.82723: in VariableManager get_vars() 7491 1727203990.82785: Calling all_inventory to load vars for managed-node3 7491 1727203990.82788: Calling groups_inventory to load vars for managed-node3 7491 1727203990.82791: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203990.82807: Calling all_plugins_play to load vars for managed-node3 7491 1727203990.82810: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203990.82813: Calling groups_plugins_play to load vars for managed-node3 7491 1727203990.83784: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010b3 7491 1727203990.83788: WORKER PROCESS EXITING 7491 1727203990.84656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203990.86272: done with get_vars() 7491 1727203990.86298: done getting variables 7491 1727203990.86360: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203990.86479: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.057) 0:00:32.789 ***** 7491 1727203990.86512: entering _queue_task() for managed-node3/command 7491 1727203990.86829: worker is 1 (out of 1 available) 7491 1727203990.86841: exiting _queue_task() for managed-node3/command 7491 1727203990.86854: done queuing things up, now waiting for results queue to drain 7491 1727203990.86856: waiting for pending results... 
7491 1727203990.87156: running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 7491 1727203990.87285: in run() - task 0affcd87-79f5-0a4a-ad01-0000000010b4 7491 1727203990.87310: variable 'ansible_search_path' from source: unknown 7491 1727203990.87318: variable 'ansible_search_path' from source: unknown 7491 1727203990.87360: calling self._execute() 7491 1727203990.87478: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203990.87488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203990.87502: variable 'omit' from source: magic vars 7491 1727203990.87897: variable 'ansible_distribution_major_version' from source: facts 7491 1727203990.87917: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203990.88112: variable 'type' from source: play vars 7491 1727203990.88128: variable 'state' from source: include params 7491 1727203990.88136: variable 'interface' from source: play vars 7491 1727203990.88143: variable 'current_interfaces' from source: set_fact 7491 1727203990.88153: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7491 1727203990.88161: when evaluation is False, skipping this task 7491 1727203990.88168: _execute() done 7491 1727203990.88175: dumping result to json 7491 1727203990.88182: done dumping result, returning 7491 1727203990.88189: done running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 [0affcd87-79f5-0a4a-ad01-0000000010b4] 7491 1727203990.88200: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010b4 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7491 1727203990.88346: no more pending results, returning what we have 7491 1727203990.88351: results queue empty 7491 1727203990.88352: checking for any_errors_fatal 
7491 1727203990.88359: done checking for any_errors_fatal 7491 1727203990.88360: checking for max_fail_percentage 7491 1727203990.88362: done checking for max_fail_percentage 7491 1727203990.88363: checking to see if all hosts have failed and the running result is not ok 7491 1727203990.88366: done checking to see if all hosts have failed 7491 1727203990.88367: getting the remaining hosts for this loop 7491 1727203990.88369: done getting the remaining hosts for this loop 7491 1727203990.88374: getting the next task for host managed-node3 7491 1727203990.88385: done getting next task for host managed-node3 7491 1727203990.88392: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7491 1727203990.88396: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203990.88423: getting variables 7491 1727203990.88426: in VariableManager get_vars() 7491 1727203990.88483: Calling all_inventory to load vars for managed-node3 7491 1727203990.88486: Calling groups_inventory to load vars for managed-node3 7491 1727203990.88490: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203990.88503: Calling all_plugins_play to load vars for managed-node3 7491 1727203990.88507: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203990.88510: Calling groups_plugins_play to load vars for managed-node3 7491 1727203990.89504: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000010b4 7491 1727203990.89508: WORKER PROCESS EXITING 7491 1727203990.90485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203990.93477: done with get_vars() 7491 1727203990.93624: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:53:10 -0400 (0:00:00.073) 0:00:32.862 ***** 7491 1727203990.93845: entering _queue_task() for managed-node3/include_tasks 7491 1727203990.94503: worker is 1 (out of 1 available) 7491 1727203990.94518: exiting _queue_task() for managed-node3/include_tasks 7491 1727203990.94532: done queuing things up, now waiting for results queue to drain 7491 1727203990.94533: waiting for pending results... 
7491 1727203990.95346: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7491 1727203990.95754: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000b8 7491 1727203990.95779: variable 'ansible_search_path' from source: unknown 7491 1727203990.95788: variable 'ansible_search_path' from source: unknown 7491 1727203990.95837: calling self._execute() 7491 1727203990.96060: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203990.96133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203990.96149: variable 'omit' from source: magic vars 7491 1727203990.96897: variable 'ansible_distribution_major_version' from source: facts 7491 1727203990.96909: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203990.96916: _execute() done 7491 1727203990.96922: dumping result to json 7491 1727203990.96926: done dumping result, returning 7491 1727203990.96935: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-0a4a-ad01-0000000000b8] 7491 1727203990.96941: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000b8 7491 1727203990.97038: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000b8 7491 1727203990.97041: WORKER PROCESS EXITING 7491 1727203990.97093: no more pending results, returning what we have 7491 1727203990.97098: in VariableManager get_vars() 7491 1727203990.97159: Calling all_inventory to load vars for managed-node3 7491 1727203990.97162: Calling groups_inventory to load vars for managed-node3 7491 1727203990.97166: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203990.97179: Calling all_plugins_play to load vars for managed-node3 7491 1727203990.97181: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203990.97184: Calling groups_plugins_play to load vars for 
managed-node3 7491 1727203990.98866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203991.00683: done with get_vars() 7491 1727203991.00709: variable 'ansible_search_path' from source: unknown 7491 1727203991.00710: variable 'ansible_search_path' from source: unknown 7491 1727203991.00768: we have included files to process 7491 1727203991.00770: generating all_blocks data 7491 1727203991.00772: done generating all_blocks data 7491 1727203991.00777: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7491 1727203991.00778: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7491 1727203991.00781: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7491 1727203991.01420: done processing included file 7491 1727203991.01422: iterating over new_blocks loaded from include file 7491 1727203991.01423: in VariableManager get_vars() 7491 1727203991.01473: done with get_vars() 7491 1727203991.01475: filtering new block on tags 7491 1727203991.01499: done filtering new block on tags 7491 1727203991.01501: in VariableManager get_vars() 7491 1727203991.01532: done with get_vars() 7491 1727203991.01533: filtering new block on tags 7491 1727203991.01553: done filtering new block on tags 7491 1727203991.01556: in VariableManager get_vars() 7491 1727203991.01586: done with get_vars() 7491 1727203991.01588: filtering new block on tags 7491 1727203991.01611: done filtering new block on tags 7491 1727203991.01614: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 7491 1727203991.01620: extending task lists for all hosts with included blocks 7491 1727203991.02524: done 
extending task lists 7491 1727203991.02526: done processing included files 7491 1727203991.02527: results queue empty 7491 1727203991.02527: checking for any_errors_fatal 7491 1727203991.02530: done checking for any_errors_fatal 7491 1727203991.02531: checking for max_fail_percentage 7491 1727203991.02532: done checking for max_fail_percentage 7491 1727203991.02533: checking to see if all hosts have failed and the running result is not ok 7491 1727203991.02534: done checking to see if all hosts have failed 7491 1727203991.02535: getting the remaining hosts for this loop 7491 1727203991.02536: done getting the remaining hosts for this loop 7491 1727203991.02538: getting the next task for host managed-node3 7491 1727203991.02542: done getting next task for host managed-node3 7491 1727203991.02547: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7491 1727203991.02550: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
7491 1727203991.02560: getting variables
7491 1727203991.02561: in VariableManager get_vars()
7491 1727203991.02588: Calling all_inventory to load vars for managed-node3
7491 1727203991.02591: Calling groups_inventory to load vars for managed-node3
7491 1727203991.02593: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203991.02598: Calling all_plugins_play to load vars for managed-node3
7491 1727203991.02600: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203991.02603: Calling groups_plugins_play to load vars for managed-node3
7491 1727203991.05306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203991.09747: done with get_vars()
7491 1727203991.09778: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 14:53:11 -0400 (0:00:00.160) 0:00:33.022 *****

7491 1727203991.09862: entering _queue_task() for managed-node3/setup
7491 1727203991.10188: worker is 1 (out of 1 available)
7491 1727203991.10199: exiting _queue_task() for managed-node3/setup
7491 1727203991.10211: done queuing things up, now waiting for results queue to drain
7491 1727203991.10213: waiting for pending results...
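The guard this task evaluates further down in the log, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, gathers facts only when something the role needs is missing. A minimal Python sketch of the same set-difference check, assuming hypothetical fact names (the real list lives in the role's defaults):

```python
# Sketch of the guard behind "Ensure ansible_facts used by role are present".
# Jinja2's `difference` filter is a set difference; the setup task runs only
# when at least one required fact has not been collected yet.

def needs_fact_gathering(required_facts, ansible_facts):
    """Return True when any required fact is absent from the collected facts."""
    missing = set(required_facts) - set(ansible_facts.keys())
    return len(missing) > 0

# Hypothetical illustration -- not the role's actual required-facts list.
required = ["distribution", "os_family"]
facts = {"distribution": "CentOS", "os_family": "RedHat"}
needs_fact_gathering(required, facts)  # False -> the task is skipped, as logged
```

When the check is False, the conditional short-circuits and the whole setup module invocation is avoided, which is why the log shows "when evaluation is False, skipping this task".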
7491 1727203991.10973: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7491 1727203991.11737: in run() - task 0affcd87-79f5-0a4a-ad01-000000001381 7491 1727203991.11760: variable 'ansible_search_path' from source: unknown 7491 1727203991.11773: variable 'ansible_search_path' from source: unknown 7491 1727203991.11820: calling self._execute() 7491 1727203991.11929: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203991.11942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203991.11957: variable 'omit' from source: magic vars 7491 1727203991.12555: variable 'ansible_distribution_major_version' from source: facts 7491 1727203991.12787: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203991.13009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203991.17590: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203991.17800: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203991.17848: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203991.18035: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203991.18067: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203991.18155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203991.18248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203991.18353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203991.18402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203991.18560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203991.18623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203991.18656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203991.18686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203991.18803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203991.18887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203991.19172: variable '__network_required_facts' from source: role '' defaults 
7491 1727203991.19322: variable 'ansible_facts' from source: unknown
7491 1727203991.20915: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
7491 1727203991.20934: when evaluation is False, skipping this task
7491 1727203991.20943: _execute() done
7491 1727203991.21051: dumping result to json
7491 1727203991.21061: done dumping result, returning
7491 1727203991.21076: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-0a4a-ad01-000000001381]
7491 1727203991.21088: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001381
skipping: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
7491 1727203991.21242: no more pending results, returning what we have
7491 1727203991.21246: results queue empty
7491 1727203991.21247: checking for any_errors_fatal
7491 1727203991.21248: done checking for any_errors_fatal
7491 1727203991.21249: checking for max_fail_percentage
7491 1727203991.21251: done checking for max_fail_percentage
7491 1727203991.21252: checking to see if all hosts have failed and the running result is not ok
7491 1727203991.21253: done checking to see if all hosts have failed
7491 1727203991.21253: getting the remaining hosts for this loop
7491 1727203991.21255: done getting the remaining hosts for this loop
7491 1727203991.21259: getting the next task for host managed-node3
7491 1727203991.21271: done getting next task for host managed-node3
7491 1727203991.21274: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
7491 1727203991.21279: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203991.21299: getting variables
7491 1727203991.21301: in VariableManager get_vars()
7491 1727203991.21353: Calling all_inventory to load vars for managed-node3
7491 1727203991.21356: Calling groups_inventory to load vars for managed-node3
7491 1727203991.21359: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203991.21372: Calling all_plugins_play to load vars for managed-node3
7491 1727203991.21375: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203991.21379: Calling groups_plugins_play to load vars for managed-node3
7491 1727203991.22017: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001381
7491 1727203991.22021: WORKER PROCESS EXITING
7491 1727203991.23909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203991.27480: done with get_vars()
7491 1727203991.27516: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 14:53:11 -0400 (0:00:00.178) 0:00:33.201 *****

7491 1727203991.27747: entering _queue_task() for managed-node3/stat
7491 1727203991.28638: worker is 1 (out of 1
available) 7491 1727203991.28649: exiting _queue_task() for managed-node3/stat 7491 1727203991.28665: done queuing things up, now waiting for results queue to drain 7491 1727203991.28667: waiting for pending results... 7491 1727203991.29360: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7491 1727203991.29634: in run() - task 0affcd87-79f5-0a4a-ad01-000000001383 7491 1727203991.29645: variable 'ansible_search_path' from source: unknown 7491 1727203991.29649: variable 'ansible_search_path' from source: unknown 7491 1727203991.29689: calling self._execute() 7491 1727203991.29783: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203991.29786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203991.29800: variable 'omit' from source: magic vars 7491 1727203991.30694: variable 'ansible_distribution_major_version' from source: facts 7491 1727203991.30706: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203991.31026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203991.31655: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203991.31704: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203991.31898: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203991.31903: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203991.32256: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203991.32399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7491 1727203991.32427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7491 1727203991.32453: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7491 1727203991.32667: variable '__network_is_ostree' from source: set_fact
7491 1727203991.32675: Evaluated conditional (not __network_is_ostree is defined): False
7491 1727203991.32678: when evaluation is False, skipping this task
7491 1727203991.32681: _execute() done
7491 1727203991.32683: dumping result to json
7491 1727203991.32687: done dumping result, returning
7491 1727203991.32809: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-0a4a-ad01-000000001383]
7491 1727203991.32817: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001383
7491 1727203991.32918: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001383
7491 1727203991.32922: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
7491 1727203991.32983: no more pending results, returning what we have
7491 1727203991.32987: results queue empty
7491 1727203991.32988: checking for any_errors_fatal
7491 1727203991.32997: done checking for any_errors_fatal
7491 1727203991.32998: checking for max_fail_percentage
7491 1727203991.33000: done checking for max_fail_percentage
7491 1727203991.33001: checking to see if all hosts have failed and the running result is not ok
7491 1727203991.33002: done checking to see if all hosts have failed
7491
1727203991.33003: getting the remaining hosts for this loop 7491 1727203991.33005: done getting the remaining hosts for this loop 7491 1727203991.33009: getting the next task for host managed-node3 7491 1727203991.33017: done getting next task for host managed-node3 7491 1727203991.33021: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7491 1727203991.33025: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
7491 1727203991.33048: getting variables
7491 1727203991.33050: in VariableManager get_vars()
7491 1727203991.33102: Calling all_inventory to load vars for managed-node3
7491 1727203991.33105: Calling groups_inventory to load vars for managed-node3
7491 1727203991.33107: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203991.33117: Calling all_plugins_play to load vars for managed-node3
7491 1727203991.33120: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203991.33123: Calling groups_plugins_play to load vars for managed-node3
7491 1727203991.36231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203991.39708: done with get_vars()
7491 1727203991.39742: done getting variables
7491 1727203991.39804: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024 14:53:11 -0400 (0:00:00.120) 0:00:33.322 *****

7491 1727203991.39842: entering _queue_task() for managed-node3/set_fact
7491 1727203991.40348: worker is 1 (out of 1 available)
7491 1727203991.40363: exiting _queue_task() for managed-node3/set_fact
7491 1727203991.40377: done queuing things up, now waiting for results queue to drain
7491 1727203991.40379: waiting for pending results...
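Both ostree tasks in this log are guarded by the condition `not __network_is_ostree is defined`, so once the fact has been set by an earlier run, the stat and set_fact steps are skipped. A minimal sketch of that detect-once pattern; `host_facts` and the local `os.path.exists` call are illustrative stand-ins for the per-host fact store and the remote stat of `/run/ostree-booted`:

```python
import os

# Detect-once guard: compute the flag only if it is not already defined,
# then reuse the cached value on every later invocation.

def is_ostree(host_facts):
    if "__network_is_ostree" not in host_facts:      # "not ... is defined"
        # Stand-in for the remote stat task; runs at most once per host.
        host_facts["__network_is_ostree"] = os.path.exists("/run/ostree-booted")
    return host_facts["__network_is_ostree"]

facts = {"__network_is_ostree": False}  # fact already set earlier in the play
is_ostree(facts)  # guard is False -> detection skipped, cached value returned
```

This is why the log shows both tasks ending in `"skip_reason": "Conditional result was False"` rather than executing the stat module again.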
7491 1727203991.42030: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7491 1727203991.42200: in run() - task 0affcd87-79f5-0a4a-ad01-000000001384 7491 1727203991.42213: variable 'ansible_search_path' from source: unknown 7491 1727203991.42217: variable 'ansible_search_path' from source: unknown 7491 1727203991.42259: calling self._execute() 7491 1727203991.42360: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203991.42366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203991.43488: variable 'omit' from source: magic vars 7491 1727203991.44198: variable 'ansible_distribution_major_version' from source: facts 7491 1727203991.44210: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203991.44887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203991.45257: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203991.45897: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203991.46009: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203991.46044: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203991.46709: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203991.46744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203991.46872: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7491 1727203991.46902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7491 1727203991.46998: variable '__network_is_ostree' from source: set_fact
7491 1727203991.47006: Evaluated conditional (not __network_is_ostree is defined): False
7491 1727203991.47009: when evaluation is False, skipping this task
7491 1727203991.47012: _execute() done
7491 1727203991.47014: dumping result to json
7491 1727203991.47017: done dumping result, returning
7491 1727203991.47029: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-0a4a-ad01-000000001384]
7491 1727203991.47035: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001384
7491 1727203991.47134: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001384
7491 1727203991.47137: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
7491 1727203991.47190: no more pending results, returning what we have
7491 1727203991.47194: results queue empty
7491 1727203991.47196: checking for any_errors_fatal
7491 1727203991.47204: done checking for any_errors_fatal
7491 1727203991.47205: checking for max_fail_percentage
7491 1727203991.47206: done checking for max_fail_percentage
7491 1727203991.47208: checking to see if all hosts have failed and the running result is not ok
7491 1727203991.47209: done checking to see if all hosts have failed
7491 1727203991.47210: getting the remaining hosts for this loop
7491 1727203991.47212: done getting the remaining hosts for this loop
7491 1727203991.47217:
getting the next task for host managed-node3 7491 1727203991.47227: done getting next task for host managed-node3 7491 1727203991.47231: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7491 1727203991.47235: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
7491 1727203991.47261: getting variables
7491 1727203991.47266: in VariableManager get_vars()
7491 1727203991.47326: Calling all_inventory to load vars for managed-node3
7491 1727203991.47330: Calling groups_inventory to load vars for managed-node3
7491 1727203991.47332: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203991.47345: Calling all_plugins_play to load vars for managed-node3
7491 1727203991.47348: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203991.47351: Calling groups_plugins_play to load vars for managed-node3
7491 1727203991.49846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203991.51717: done with get_vars()
7491 1727203991.51757: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Tuesday 24 September 2024 14:53:11 -0400 (0:00:00.120) 0:00:33.442 *****

7491 1727203991.51867: entering _queue_task() for managed-node3/service_facts
7491 1727203991.52207: worker is 1 (out of 1 available)
7491 1727203991.52221: exiting _queue_task() for managed-node3/service_facts
7491 1727203991.52235: done queuing things up, now waiting for results queue to drain
7491 1727203991.52236: waiting for pending results...
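The two timing fields in each task banner, e.g. `(0:00:00.120) 0:00:33.442`, read as the previous task's duration followed by the cumulative playbook runtime (profile-style callback output). A quick consistency check over the four banners in this section of the log, using the values as printed:

```python
# Values copied from the banners above: each cumulative figure equals the
# previous cumulative figure plus the duration printed alongside it,
# up to display rounding in the last millisecond digit.

cumulative = [33.022, 33.201, 33.322, 33.442]  # 0:00:33.xxx columns
durations = [0.178, 0.120, 0.120]              # (0:00:00.xxx) columns that follow

for prev, dur, cur in zip(cumulative, durations, cumulative[1:]):
    assert abs((prev + dur) - cur) < 0.002     # matches within rounding
```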
7491 1727203991.52554: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 7491 1727203991.52744: in run() - task 0affcd87-79f5-0a4a-ad01-000000001386 7491 1727203991.52767: variable 'ansible_search_path' from source: unknown 7491 1727203991.52776: variable 'ansible_search_path' from source: unknown 7491 1727203991.52821: calling self._execute() 7491 1727203991.52932: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203991.52947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203991.52963: variable 'omit' from source: magic vars 7491 1727203991.53383: variable 'ansible_distribution_major_version' from source: facts 7491 1727203991.53407: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203991.53419: variable 'omit' from source: magic vars 7491 1727203991.53510: variable 'omit' from source: magic vars 7491 1727203991.53551: variable 'omit' from source: magic vars 7491 1727203991.53603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203991.53644: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203991.53677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203991.53704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203991.53720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203991.53755: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203991.53769: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203991.53785: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727203991.53917: Set connection var ansible_timeout to 10 7491 1727203991.53930: Set connection var ansible_pipelining to False 7491 1727203991.53939: Set connection var ansible_shell_type to sh 7491 1727203991.53948: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203991.53959: Set connection var ansible_shell_executable to /bin/sh 7491 1727203991.53970: Set connection var ansible_connection to ssh 7491 1727203991.54002: variable 'ansible_shell_executable' from source: unknown 7491 1727203991.54010: variable 'ansible_connection' from source: unknown 7491 1727203991.54018: variable 'ansible_module_compression' from source: unknown 7491 1727203991.54028: variable 'ansible_shell_type' from source: unknown 7491 1727203991.54034: variable 'ansible_shell_executable' from source: unknown 7491 1727203991.54040: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203991.54046: variable 'ansible_pipelining' from source: unknown 7491 1727203991.54052: variable 'ansible_timeout' from source: unknown 7491 1727203991.54059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203991.54294: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203991.54325: variable 'omit' from source: magic vars 7491 1727203991.54335: starting attempt loop 7491 1727203991.54342: running the handler 7491 1727203991.54367: _low_level_execute_command(): starting 7491 1727203991.54381: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203991.55222: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203991.55243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203991.55259: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203991.55281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203991.55329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203991.55346: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203991.55361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203991.55389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203991.55407: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203991.55425: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203991.55440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203991.55467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203991.55491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203991.55513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203991.55533: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203991.55549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203991.55632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203991.55650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203991.55668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203991.55846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203991.57422: stdout chunk (state=3): >>>/root <<< 
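The `_low_level_execute_command` exchange above runs `/bin/sh -c 'echo ~ && sleep 0'` and takes the stdout (`/root` here) as the remote user's home directory, which the next step uses when building the `.ansible/tmp` working directory. A local-process sketch of that probe; Ansible actually ships the same command over the multiplexed SSH session shown in the debug chunks:

```python
import subprocess

def discover_home():
    # Same command as in the log; "echo ~" lets the shell expand the home
    # directory, and the trailing "sleep 0" is kept verbatim from the
    # logged command line.
    result = subprocess.run(
        ["/bin/sh", "-c", "echo ~ && sleep 0"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

home = discover_home()  # "/root" in the session logged above
```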
7491 1727203991.57617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203991.57621: stdout chunk (state=3): >>><<< 7491 1727203991.57624: stderr chunk (state=3): >>><<< 7491 1727203991.57751: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203991.57755: _low_level_execute_command(): starting 7491 1727203991.57758: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203991.5764883-9079-20120044041189 `" && echo ansible-tmp-1727203991.5764883-9079-20120044041189="` echo /root/.ansible/tmp/ansible-tmp-1727203991.5764883-9079-20120044041189 `" ) && sleep 0' 7491 1727203991.59106: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203991.59226: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203991.59230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203991.59279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203991.59283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203991.59286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203991.59289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203991.59390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203991.59551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203991.59554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203991.59616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203991.61419: stdout chunk (state=3): >>>ansible-tmp-1727203991.5764883-9079-20120044041189=/root/.ansible/tmp/ansible-tmp-1727203991.5764883-9079-20120044041189 <<< 7491 1727203991.61534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203991.61616: stderr chunk (state=3): >>><<< 7491 1727203991.61620: stdout chunk (state=3): >>><<< 7491 1727203991.61874: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727203991.5764883-9079-20120044041189=/root/.ansible/tmp/ansible-tmp-1727203991.5764883-9079-20120044041189 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203991.61877: variable 'ansible_module_compression' from source: unknown 7491 1727203991.61880: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7491 1727203991.61882: variable 'ansible_facts' from source: unknown 7491 1727203991.61884: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203991.5764883-9079-20120044041189/AnsiballZ_service_facts.py 7491 1727203991.63225: Sending initial data 7491 1727203991.63228: Sent initial data (159 bytes) 7491 1727203991.65324: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203991.65340: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 7491 1727203991.65354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203991.65377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203991.65422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203991.65479: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203991.65493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203991.65509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203991.65520: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203991.65531: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203991.65542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203991.65554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203991.65580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203991.65593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203991.65605: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203991.65705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203991.65783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203991.65803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203991.65817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203991.65982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 7491 1727203991.67596: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203991.67628: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203991.67665: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpztp4b039 /root/.ansible/tmp/ansible-tmp-1727203991.5764883-9079-20120044041189/AnsiballZ_service_facts.py <<< 7491 1727203991.67702: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203991.69080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203991.69271: stderr chunk (state=3): >>><<< 7491 1727203991.69275: stdout chunk (state=3): >>><<< 7491 1727203991.69278: done transferring module to remote 7491 1727203991.69280: _low_level_execute_command(): starting 7491 1727203991.69283: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203991.5764883-9079-20120044041189/ /root/.ansible/tmp/ansible-tmp-1727203991.5764883-9079-20120044041189/AnsiballZ_service_facts.py && sleep 0' 7491 1727203991.70780: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203991.70914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203991.70941: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203991.70962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203991.71015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203991.71031: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203991.71049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203991.71070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203991.71084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203991.71095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203991.71106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203991.71133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203991.71150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203991.71160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203991.71240: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203991.71254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203991.71333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203991.71383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203991.71397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203991.71571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203991.73305: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 7491 1727203991.73309: stdout chunk (state=3): >>><<< 7491 1727203991.73312: stderr chunk (state=3): >>><<< 7491 1727203991.73416: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203991.73419: _low_level_execute_command(): starting 7491 1727203991.73424: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203991.5764883-9079-20120044041189/AnsiballZ_service_facts.py && sleep 0' 7491 1727203991.75005: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203991.75059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203991.75078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203991.75169: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203991.75217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203991.75230: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203991.75244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203991.75268: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203991.75285: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203991.75298: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203991.75311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203991.75325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203991.75342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203991.75378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203991.75395: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203991.75410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203991.75604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203991.75626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203991.75641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203991.75808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203992.99692: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": 
{"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 7491 1727203992.99700: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": 
"dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-quit-wait.service": {"name": "plymouth-qu<<< 7491 1727203992.99731: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": 
"sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": 
"stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 7491 1727203992.99770: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": 
"kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap<<< 7491 1727203992.99774: stdout chunk (state=3): >>>.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": 
"systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "sy<<< 7491 1727203992.99785: stdout chunk (state=3): >>>stemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7491 1727203993.01117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203993.01121: stdout chunk (state=3): >>><<< 7491 1727203993.01124: stderr chunk (state=3): >>><<< 7491 1727203993.01774: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": 
{"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": 
"systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203993.02143: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203991.5764883-9079-20120044041189/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203993.02158: _low_level_execute_command(): starting 7491 1727203993.02168: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203991.5764883-9079-20120044041189/ > /dev/null 2>&1 && sleep 0' 7491 1727203993.02784: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203993.02798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203993.02813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.02831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.02876: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.02888: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203993.02902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.02918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203993.02930: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203993.02940: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203993.02951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203993.02966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.02981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.02992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.03001: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203993.03013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.03089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203993.03105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203993.03120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203993.03833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203993.05672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203993.05676: stdout chunk (state=3): >>><<< 7491 1727203993.05678: stderr chunk (state=3): >>><<< 7491 1727203993.05770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203993.05774: handler run complete 7491 1727203993.05890: variable 'ansible_facts' from source: unknown 7491 1727203993.06023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203993.06421: variable 'ansible_facts' from source: unknown 7491 1727203993.06535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203993.06717: attempt loop complete, returning result 7491 1727203993.06729: _execute() done 7491 1727203993.06737: dumping result to json 7491 1727203993.06797: done dumping result, returning 7491 1727203993.06814: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-0a4a-ad01-000000001386] 7491 1727203993.06826: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001386 ok: [managed-node3] 
=> { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203993.08110: no more pending results, returning what we have 7491 1727203993.08113: results queue empty 7491 1727203993.08116: checking for any_errors_fatal 7491 1727203993.08119: done checking for any_errors_fatal 7491 1727203993.08120: checking for max_fail_percentage 7491 1727203993.08121: done checking for max_fail_percentage 7491 1727203993.08122: checking to see if all hosts have failed and the running result is not ok 7491 1727203993.08123: done checking to see if all hosts have failed 7491 1727203993.08124: getting the remaining hosts for this loop 7491 1727203993.08126: done getting the remaining hosts for this loop 7491 1727203993.08129: getting the next task for host managed-node3 7491 1727203993.08135: done getting next task for host managed-node3 7491 1727203993.08138: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7491 1727203993.08143: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
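The `ok: [managed-node3]` result above is replaced with a `"censored"` placeholder because the task set `no_log: true`. A minimal sketch of that censoring behavior — assuming a plain result dict, and mirroring the exact message seen in the log; this is an illustration, not Ansible's actual implementation:

```python
def censor_result(result: dict, no_log: bool) -> dict:
    """Hide a task result when no_log is set, keeping only 'changed'."""
    if not no_log:
        return result
    # Same wording as the placeholder emitted in the log above.
    return {
        "censored": "the output has been hidden due to the fact that "
                    "'no_log: true' was specified for this result",
        "changed": result.get("changed", False),
    }

# With no_log on, the facts payload is dropped entirely:
print(censor_result({"ansible_facts": {"services": {}}, "changed": False}, True))
```

Note that `changed` is preserved so change-tracking and handlers still work even when the payload is hidden.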
False 7491 1727203993.08153: getting variables 7491 1727203993.08155: in VariableManager get_vars() 7491 1727203993.08198: Calling all_inventory to load vars for managed-node3 7491 1727203993.08201: Calling groups_inventory to load vars for managed-node3 7491 1727203993.08203: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203993.08214: Calling all_plugins_play to load vars for managed-node3 7491 1727203993.08216: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203993.08219: Calling groups_plugins_play to load vars for managed-node3 7491 1727203993.08887: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001386 7491 1727203993.08896: WORKER PROCESS EXITING 7491 1727203993.09538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203993.12025: done with get_vars() 7491 1727203993.12060: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:53:13 -0400 (0:00:01.603) 0:00:35.045 ***** 7491 1727203993.12172: entering _queue_task() for managed-node3/package_facts 7491 1727203993.12529: worker is 1 (out of 1 available) 7491 1727203993.12542: exiting _queue_task() for managed-node3/package_facts 7491 1727203993.12555: done queuing things up, now waiting for results queue to drain 7491 1727203993.12557: waiting for pending results... 
7491 1727203993.12869: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7491 1727203993.13048: in run() - task 0affcd87-79f5-0a4a-ad01-000000001387 7491 1727203993.13072: variable 'ansible_search_path' from source: unknown 7491 1727203993.13080: variable 'ansible_search_path' from source: unknown 7491 1727203993.13126: calling self._execute() 7491 1727203993.13232: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203993.13243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203993.13259: variable 'omit' from source: magic vars 7491 1727203993.13657: variable 'ansible_distribution_major_version' from source: facts 7491 1727203993.13685: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203993.13794: variable 'omit' from source: magic vars 7491 1727203993.13875: variable 'omit' from source: magic vars 7491 1727203993.13923: variable 'omit' from source: magic vars 7491 1727203993.13971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203993.14013: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203993.14047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203993.14071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203993.14089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203993.14130: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203993.14139: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203993.14147: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727203993.14256: Set connection var ansible_timeout to 10 7491 1727203993.14271: Set connection var ansible_pipelining to False 7491 1727203993.14282: Set connection var ansible_shell_type to sh 7491 1727203993.14292: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203993.14304: Set connection var ansible_shell_executable to /bin/sh 7491 1727203993.14314: Set connection var ansible_connection to ssh 7491 1727203993.14347: variable 'ansible_shell_executable' from source: unknown 7491 1727203993.14355: variable 'ansible_connection' from source: unknown 7491 1727203993.14363: variable 'ansible_module_compression' from source: unknown 7491 1727203993.14373: variable 'ansible_shell_type' from source: unknown 7491 1727203993.14380: variable 'ansible_shell_executable' from source: unknown 7491 1727203993.14386: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203993.14395: variable 'ansible_pipelining' from source: unknown 7491 1727203993.14403: variable 'ansible_timeout' from source: unknown 7491 1727203993.14410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203993.14627: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203993.14644: variable 'omit' from source: magic vars 7491 1727203993.14768: starting attempt loop 7491 1727203993.14778: running the handler 7491 1727203993.14798: _low_level_execute_command(): starting 7491 1727203993.14812: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203993.16657: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203993.16679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203993.16690: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.16711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.16751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.16758: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203993.16770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.16784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203993.16792: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203993.16799: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203993.16809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203993.16824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.16840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.16848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.16857: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203993.16868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.16950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203993.16971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203993.16984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203993.17068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203993.19107: stdout chunk (state=3): >>>/root <<< 
7491 1727203993.19201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203993.19204: stdout chunk (state=3): >>><<< 7491 1727203993.19215: stderr chunk (state=3): >>><<< 7491 1727203993.19839: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203993.19853: _low_level_execute_command(): starting 7491 1727203993.19860: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203993.1983867-9128-216764280958866 `" && echo ansible-tmp-1727203993.1983867-9128-216764280958866="` echo /root/.ansible/tmp/ansible-tmp-1727203993.1983867-9128-216764280958866 `" ) && sleep 0' 7491 1727203993.20833: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.20837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.20874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203993.20881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.20886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203993.20897: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727203993.20902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.20915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203993.20918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.20984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203993.20997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203993.21006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203993.21077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203993.22880: stdout chunk (state=3): >>>ansible-tmp-1727203993.1983867-9128-216764280958866=/root/.ansible/tmp/ansible-tmp-1727203993.1983867-9128-216764280958866 <<< 7491 1727203993.23069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203993.23073: stdout chunk (state=3): >>><<< 7491 
1727203993.23088: stderr chunk (state=3): >>><<< 7491 1727203993.23171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203993.1983867-9128-216764280958866=/root/.ansible/tmp/ansible-tmp-1727203993.1983867-9128-216764280958866 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203993.23176: variable 'ansible_module_compression' from source: unknown 7491 1727203993.23470: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7491 1727203993.23474: variable 'ansible_facts' from source: unknown 7491 1727203993.23477: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203993.1983867-9128-216764280958866/AnsiballZ_package_facts.py 7491 1727203993.23639: Sending initial data 7491 1727203993.23642: Sent initial data (160 bytes) 7491 1727203993.25450: stderr chunk (state=3): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203993.25488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203993.25500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.25566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.25605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.25612: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203993.25625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.25638: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203993.25647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203993.25662: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203993.25691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203993.25701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.25713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.25732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.25739: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203993.25748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.25851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203993.25862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203993.25880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 7491 1727203993.25953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203993.27643: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203993.27687: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203993.27726: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpdgtg9i_4 /root/.ansible/tmp/ansible-tmp-1727203993.1983867-9128-216764280958866/AnsiballZ_package_facts.py <<< 7491 1727203993.27761: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203993.30049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203993.30286: stderr chunk (state=3): >>><<< 7491 1727203993.30290: stdout chunk (state=3): >>><<< 7491 1727203993.30293: done transferring module to remote 7491 1727203993.30295: _low_level_execute_command(): starting 7491 1727203993.30297: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203993.1983867-9128-216764280958866/ /root/.ansible/tmp/ansible-tmp-1727203993.1983867-9128-216764280958866/AnsiballZ_package_facts.py && sleep 0' 7491 1727203993.31033: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203993.31067: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203993.31084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.31104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.31157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.31184: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203993.31200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.31219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203993.31231: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203993.31242: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203993.31254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203993.31274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.31292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.31305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.31316: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203993.31331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.31425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203993.31450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203993.31469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203993.31552: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7491 1727203993.33281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203993.33393: stderr chunk (state=3): >>><<< 7491 1727203993.33409: stdout chunk (state=3): >>><<< 7491 1727203993.33535: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203993.33538: _low_level_execute_command(): starting 7491 1727203993.33541: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203993.1983867-9128-216764280958866/AnsiballZ_package_facts.py && sleep 0' 7491 1727203993.34289: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203993.34313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203993.34330: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.34358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.34414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.34428: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203993.34443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.34461: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203993.34477: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203993.34488: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203993.34501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203993.34536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.34554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.34569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.34581: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203993.34595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.34713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203993.35061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203993.35142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203993.81424: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": 
"libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": 
"3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": 
"5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", 
"release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": 
[{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": 
"43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": 
[{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libsel<<< 7491 1727203993.81527: stdout chunk (state=3): >>>inux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": 
[{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": 
"1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "releas<<< 7491 1727203993.81573: stdout chunk (state=3): >>>e": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": 
"perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt<<< 7491 1727203993.81582: stdout chunk (state=3): >>>-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": 
"0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", 
"version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil<<< 7491 1727203993.81632: stdout chunk (state=3): >>>": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": 
"git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": 
"21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", 
"version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "e<<< 7491 1727203993.81638: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7491 1727203993.83390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203993.83395: stdout chunk (state=3): >>><<< 7491 1727203993.83398: stderr chunk (state=3): >>><<< 7491 1727203993.83579: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203993.85879: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203993.1983867-9128-216764280958866/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203993.85910: _low_level_execute_command(): starting 7491 1727203993.85920: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203993.1983867-9128-216764280958866/ > /dev/null 2>&1 && sleep 0' 7491 1727203993.86579: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203993.86594: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 7491 1727203993.86608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.86627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.86674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.86686: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203993.86699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.86715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203993.86726: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203993.86737: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203993.86748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203993.86760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203993.86776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203993.86786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203993.86794: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203993.86807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203993.86883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203993.86907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203993.86925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203993.87001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 
1727203993.88785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203993.88887: stderr chunk (state=3): >>><<< 7491 1727203993.88890: stdout chunk (state=3): >>><<< 7491 1727203993.88908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203993.88914: handler run complete 7491 1727203993.89965: variable 'ansible_facts' from source: unknown 7491 1727203993.91271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203993.95842: variable 'ansible_facts' from source: unknown 7491 1727203993.96456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203993.97271: attempt loop complete, returning result 7491 1727203993.97288: _execute() done 7491 1727203993.97291: dumping result to 
json 7491 1727203993.97547: done dumping result, returning 7491 1727203993.97557: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-0a4a-ad01-000000001387] 7491 1727203993.97565: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001387 7491 1727203994.00620: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001387 7491 1727203994.00625: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203994.00798: no more pending results, returning what we have 7491 1727203994.00802: results queue empty 7491 1727203994.00803: checking for any_errors_fatal 7491 1727203994.00809: done checking for any_errors_fatal 7491 1727203994.00810: checking for max_fail_percentage 7491 1727203994.00811: done checking for max_fail_percentage 7491 1727203994.00812: checking to see if all hosts have failed and the running result is not ok 7491 1727203994.00814: done checking to see if all hosts have failed 7491 1727203994.00814: getting the remaining hosts for this loop 7491 1727203994.00816: done getting the remaining hosts for this loop 7491 1727203994.00823: getting the next task for host managed-node3 7491 1727203994.00829: done getting next task for host managed-node3 7491 1727203994.00833: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7491 1727203994.00836: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203994.00849: getting variables 7491 1727203994.00850: in VariableManager get_vars() 7491 1727203994.00897: Calling all_inventory to load vars for managed-node3 7491 1727203994.00901: Calling groups_inventory to load vars for managed-node3 7491 1727203994.00903: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203994.00913: Calling all_plugins_play to load vars for managed-node3 7491 1727203994.00916: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203994.00922: Calling groups_plugins_play to load vars for managed-node3 7491 1727203994.02356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203994.04646: done with get_vars() 7491 1727203994.04689: done getting variables 7491 1727203994.04769: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:53:14 -0400 (0:00:00.926) 0:00:35.971 ***** 7491 1727203994.04807: entering _queue_task() for managed-node3/debug 7491 1727203994.05411: worker is 1 (out of 1 available) 7491 1727203994.05429: exiting _queue_task() for managed-node3/debug 7491 1727203994.05442: done queuing things up, now waiting for results queue to drain 7491 1727203994.05444: waiting for pending results... 
7491 1727203994.06021: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 7491 1727203994.06143: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000b9 7491 1727203994.06161: variable 'ansible_search_path' from source: unknown 7491 1727203994.06165: variable 'ansible_search_path' from source: unknown 7491 1727203994.06202: calling self._execute() 7491 1727203994.06304: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203994.06308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203994.06322: variable 'omit' from source: magic vars 7491 1727203994.06824: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.06854: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203994.06871: variable 'omit' from source: magic vars 7491 1727203994.06947: variable 'omit' from source: magic vars 7491 1727203994.07072: variable 'network_provider' from source: set_fact 7491 1727203994.07095: variable 'omit' from source: magic vars 7491 1727203994.07148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203994.07201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203994.07234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203994.07257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203994.07284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203994.07323: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203994.07332: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203994.07341: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203994.07467: Set connection var ansible_timeout to 10 7491 1727203994.07479: Set connection var ansible_pipelining to False 7491 1727203994.07493: Set connection var ansible_shell_type to sh 7491 1727203994.07506: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203994.07520: Set connection var ansible_shell_executable to /bin/sh 7491 1727203994.07529: Set connection var ansible_connection to ssh 7491 1727203994.07555: variable 'ansible_shell_executable' from source: unknown 7491 1727203994.07562: variable 'ansible_connection' from source: unknown 7491 1727203994.07572: variable 'ansible_module_compression' from source: unknown 7491 1727203994.07578: variable 'ansible_shell_type' from source: unknown 7491 1727203994.07584: variable 'ansible_shell_executable' from source: unknown 7491 1727203994.07589: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203994.07601: variable 'ansible_pipelining' from source: unknown 7491 1727203994.07612: variable 'ansible_timeout' from source: unknown 7491 1727203994.07622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203994.07781: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203994.07799: variable 'omit' from source: magic vars 7491 1727203994.07808: starting attempt loop 7491 1727203994.07821: running the handler 7491 1727203994.07874: handler run complete 7491 1727203994.07894: attempt loop complete, returning result 7491 1727203994.07900: _execute() done 7491 1727203994.07905: dumping result to json 7491 1727203994.07911: done dumping result, returning 7491 1727203994.07927: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-0a4a-ad01-0000000000b9] 7491 1727203994.07939: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000b9 ok: [managed-node3] => {} MSG: Using network provider: nm 7491 1727203994.08109: no more pending results, returning what we have 7491 1727203994.08113: results queue empty 7491 1727203994.08114: checking for any_errors_fatal 7491 1727203994.08126: done checking for any_errors_fatal 7491 1727203994.08127: checking for max_fail_percentage 7491 1727203994.08129: done checking for max_fail_percentage 7491 1727203994.08130: checking to see if all hosts have failed and the running result is not ok 7491 1727203994.08131: done checking to see if all hosts have failed 7491 1727203994.08132: getting the remaining hosts for this loop 7491 1727203994.08134: done getting the remaining hosts for this loop 7491 1727203994.08138: getting the next task for host managed-node3 7491 1727203994.08146: done getting next task for host managed-node3 7491 1727203994.08151: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7491 1727203994.08153: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203994.08166: getting variables 7491 1727203994.08168: in VariableManager get_vars() 7491 1727203994.08223: Calling all_inventory to load vars for managed-node3 7491 1727203994.08227: Calling groups_inventory to load vars for managed-node3 7491 1727203994.08229: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203994.08242: Calling all_plugins_play to load vars for managed-node3 7491 1727203994.08244: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203994.08247: Calling groups_plugins_play to load vars for managed-node3 7491 1727203994.09287: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000b9 7491 1727203994.09293: WORKER PROCESS EXITING 7491 1727203994.10632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203994.13173: done with get_vars() 7491 1727203994.13214: done getting variables 7491 1727203994.13290: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:53:14 -0400 (0:00:00.085) 0:00:36.057 ***** 7491 1727203994.13330: entering _queue_task() for managed-node3/fail 7491 1727203994.13698: worker is 1 (out of 1 available) 7491 1727203994.13718: exiting _queue_task() for managed-node3/fail 7491 1727203994.13733: done queuing things up, now waiting for results queue to drain 7491 1727203994.13734: waiting for pending results... 
7491 1727203994.14193: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7491 1727203994.14354: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000ba 7491 1727203994.14380: variable 'ansible_search_path' from source: unknown 7491 1727203994.14389: variable 'ansible_search_path' from source: unknown 7491 1727203994.14441: calling self._execute() 7491 1727203994.14670: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203994.14682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203994.14699: variable 'omit' from source: magic vars 7491 1727203994.15489: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.15636: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203994.15821: variable 'network_state' from source: role '' defaults 7491 1727203994.15916: Evaluated conditional (network_state != {}): False 7491 1727203994.15947: when evaluation is False, skipping this task 7491 1727203994.15954: _execute() done 7491 1727203994.15960: dumping result to json 7491 1727203994.15969: done dumping result, returning 7491 1727203994.16033: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-0a4a-ad01-0000000000ba] 7491 1727203994.16058: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000ba skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203994.16213: no more pending results, returning what we have 7491 1727203994.16217: results queue empty 7491 1727203994.16218: checking for any_errors_fatal 7491 1727203994.16227: done checking for 
any_errors_fatal 7491 1727203994.16228: checking for max_fail_percentage 7491 1727203994.16230: done checking for max_fail_percentage 7491 1727203994.16231: checking to see if all hosts have failed and the running result is not ok 7491 1727203994.16232: done checking to see if all hosts have failed 7491 1727203994.16233: getting the remaining hosts for this loop 7491 1727203994.16235: done getting the remaining hosts for this loop 7491 1727203994.16239: getting the next task for host managed-node3 7491 1727203994.16247: done getting next task for host managed-node3 7491 1727203994.16251: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7491 1727203994.16254: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203994.16279: getting variables 7491 1727203994.16281: in VariableManager get_vars() 7491 1727203994.16336: Calling all_inventory to load vars for managed-node3 7491 1727203994.16339: Calling groups_inventory to load vars for managed-node3 7491 1727203994.16342: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203994.16355: Calling all_plugins_play to load vars for managed-node3 7491 1727203994.16358: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203994.16361: Calling groups_plugins_play to load vars for managed-node3 7491 1727203994.17312: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000ba 7491 1727203994.17316: WORKER PROCESS EXITING 7491 1727203994.18247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203994.19983: done with get_vars() 7491 1727203994.20022: done getting variables 7491 1727203994.20090: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:53:14 -0400 (0:00:00.067) 0:00:36.125 ***** 7491 1727203994.20126: entering _queue_task() for managed-node3/fail 7491 1727203994.20473: worker is 1 (out of 1 available) 7491 1727203994.20490: exiting _queue_task() for managed-node3/fail 7491 1727203994.20504: done queuing things up, now waiting for results queue to drain 7491 1727203994.20505: waiting for pending results... 
7491 1727203994.20810: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7491 1727203994.20962: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000bb 7491 1727203994.20987: variable 'ansible_search_path' from source: unknown 7491 1727203994.20995: variable 'ansible_search_path' from source: unknown 7491 1727203994.21043: calling self._execute() 7491 1727203994.21154: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203994.21172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203994.21191: variable 'omit' from source: magic vars 7491 1727203994.21610: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.21633: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203994.21768: variable 'network_state' from source: role '' defaults 7491 1727203994.21785: Evaluated conditional (network_state != {}): False 7491 1727203994.21794: when evaluation is False, skipping this task 7491 1727203994.21804: _execute() done 7491 1727203994.21812: dumping result to json 7491 1727203994.21822: done dumping result, returning 7491 1727203994.21833: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-0a4a-ad01-0000000000bb] 7491 1727203994.21844: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000bb skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203994.22007: no more pending results, returning what we have 7491 1727203994.22011: results queue empty 7491 1727203994.22013: checking for any_errors_fatal 7491 1727203994.22020: done checking for any_errors_fatal 7491 1727203994.22021: 
checking for max_fail_percentage 7491 1727203994.22023: done checking for max_fail_percentage 7491 1727203994.22024: checking to see if all hosts have failed and the running result is not ok 7491 1727203994.22026: done checking to see if all hosts have failed 7491 1727203994.22026: getting the remaining hosts for this loop 7491 1727203994.22029: done getting the remaining hosts for this loop 7491 1727203994.22033: getting the next task for host managed-node3 7491 1727203994.22039: done getting next task for host managed-node3 7491 1727203994.22044: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7491 1727203994.22047: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203994.22075: getting variables 7491 1727203994.22077: in VariableManager get_vars() 7491 1727203994.22131: Calling all_inventory to load vars for managed-node3 7491 1727203994.22134: Calling groups_inventory to load vars for managed-node3 7491 1727203994.22137: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203994.22150: Calling all_plugins_play to load vars for managed-node3 7491 1727203994.22153: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203994.22156: Calling groups_plugins_play to load vars for managed-node3 7491 1727203994.23105: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000bb 7491 1727203994.23109: WORKER PROCESS EXITING 7491 1727203994.24038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203994.25757: done with get_vars() 7491 1727203994.25796: done getting variables 7491 1727203994.25867: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:53:14 -0400 (0:00:00.057) 0:00:36.182 ***** 7491 1727203994.25904: entering _queue_task() for managed-node3/fail 7491 1727203994.26259: worker is 1 (out of 1 available) 7491 1727203994.26277: exiting _queue_task() for managed-node3/fail 7491 1727203994.26291: done queuing things up, now waiting for results queue to drain 7491 1727203994.26293: waiting for pending results... 
7491 1727203994.26610: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7491 1727203994.26775: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000bc 7491 1727203994.26794: variable 'ansible_search_path' from source: unknown 7491 1727203994.26802: variable 'ansible_search_path' from source: unknown 7491 1727203994.26847: calling self._execute() 7491 1727203994.26947: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203994.26960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203994.26975: variable 'omit' from source: magic vars 7491 1727203994.27352: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.27373: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203994.27561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203994.30091: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203994.30172: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203994.30220: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203994.30262: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203994.30299: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203994.30387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.30423: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.30459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.30509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.30531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.30635: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.30666: Evaluated conditional (ansible_distribution_major_version | int > 9): False 7491 1727203994.30675: when evaluation is False, skipping this task 7491 1727203994.30683: _execute() done 7491 1727203994.30690: dumping result to json 7491 1727203994.30698: done dumping result, returning 7491 1727203994.30711: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-0a4a-ad01-0000000000bc] 7491 1727203994.30721: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000bc skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 7491 1727203994.31015: no more pending results, returning what we have 7491 1727203994.31019: results queue empty 7491 1727203994.31021: checking for any_errors_fatal 7491 1727203994.31029: done checking for any_errors_fatal 7491 1727203994.31030: 
checking for max_fail_percentage 7491 1727203994.31031: done checking for max_fail_percentage 7491 1727203994.31032: checking to see if all hosts have failed and the running result is not ok 7491 1727203994.31034: done checking to see if all hosts have failed 7491 1727203994.31035: getting the remaining hosts for this loop 7491 1727203994.31037: done getting the remaining hosts for this loop 7491 1727203994.31042: getting the next task for host managed-node3 7491 1727203994.31049: done getting next task for host managed-node3 7491 1727203994.31053: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7491 1727203994.31055: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203994.31080: getting variables 7491 1727203994.31082: in VariableManager get_vars() 7491 1727203994.31139: Calling all_inventory to load vars for managed-node3 7491 1727203994.31142: Calling groups_inventory to load vars for managed-node3 7491 1727203994.31145: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203994.31157: Calling all_plugins_play to load vars for managed-node3 7491 1727203994.31160: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203994.31163: Calling groups_plugins_play to load vars for managed-node3 7491 1727203994.32690: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000bc 7491 1727203994.32694: WORKER PROCESS EXITING 7491 1727203994.33098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203994.35163: done with get_vars() 7491 1727203994.35190: done getting variables 7491 1727203994.35262: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:53:14 -0400 (0:00:00.093) 0:00:36.276 ***** 7491 1727203994.35298: entering _queue_task() for managed-node3/dnf 7491 1727203994.35646: worker is 1 (out of 1 available) 7491 1727203994.35660: exiting _queue_task() for managed-node3/dnf 7491 1727203994.35677: done queuing things up, now waiting for results queue to drain 7491 1727203994.35678: waiting for pending results... 
7491 1727203994.35989: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7491 1727203994.36146: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000bd 7491 1727203994.36198: variable 'ansible_search_path' from source: unknown 7491 1727203994.36219: variable 'ansible_search_path' from source: unknown 7491 1727203994.36995: calling self._execute() 7491 1727203994.37225: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203994.37237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203994.37254: variable 'omit' from source: magic vars 7491 1727203994.38102: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.38123: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203994.38574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203994.43592: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203994.43791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203994.43846: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203994.43970: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203994.44005: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203994.44122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.44290: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.44322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.44481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.44501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.44741: variable 'ansible_distribution' from source: facts 7491 1727203994.44751: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.44774: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7491 1727203994.45129: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203994.45390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.45420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.45453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.45601: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.45621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.45781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.45808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.45837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.45885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.45903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.46032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.46058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.46123: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.46167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.46331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.46503: variable 'network_connections' from source: task vars 7491 1727203994.46657: variable 'interface' from source: play vars 7491 1727203994.46738: variable 'interface' from source: play vars 7491 1727203994.46936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203994.47282: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203994.47441: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203994.47541: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203994.47648: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203994.47703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203994.47858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203994.47901: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.47933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203994.48005: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203994.48544: variable 'network_connections' from source: task vars 7491 1727203994.48707: variable 'interface' from source: play vars 7491 1727203994.48777: variable 'interface' from source: play vars 7491 1727203994.48825: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727203994.48921: when evaluation is False, skipping this task 7491 1727203994.48929: _execute() done 7491 1727203994.48937: dumping result to json 7491 1727203994.48945: done dumping result, returning 7491 1727203994.48957: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-0000000000bd] 7491 1727203994.48970: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000bd skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7491 1727203994.49133: no more pending results, returning what we have 7491 1727203994.49137: results queue empty 7491 1727203994.49138: checking for any_errors_fatal 7491 1727203994.49148: done checking for any_errors_fatal 7491 1727203994.49148: checking for max_fail_percentage 7491 1727203994.49151: done checking for max_fail_percentage 7491 1727203994.49152: checking to see if all hosts have failed 
and the running result is not ok 7491 1727203994.49153: done checking to see if all hosts have failed 7491 1727203994.49154: getting the remaining hosts for this loop 7491 1727203994.49156: done getting the remaining hosts for this loop 7491 1727203994.49161: getting the next task for host managed-node3 7491 1727203994.49170: done getting next task for host managed-node3 7491 1727203994.49175: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7491 1727203994.49177: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203994.49201: getting variables 7491 1727203994.49204: in VariableManager get_vars() 7491 1727203994.49255: Calling all_inventory to load vars for managed-node3 7491 1727203994.49258: Calling groups_inventory to load vars for managed-node3 7491 1727203994.49260: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203994.49273: Calling all_plugins_play to load vars for managed-node3 7491 1727203994.49275: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203994.49278: Calling groups_plugins_play to load vars for managed-node3 7491 1727203994.50599: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000bd 7491 1727203994.50603: WORKER PROCESS EXITING 7491 1727203994.52000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203994.54217: done with get_vars() 7491 1727203994.54376: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7491 1727203994.54581: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:53:14 -0400 (0:00:00.193) 0:00:36.470 ***** 7491 1727203994.54617: entering _queue_task() for managed-node3/yum 7491 1727203994.55096: worker is 1 (out of 1 available) 7491 1727203994.55119: exiting _queue_task() for managed-node3/yum 7491 1727203994.55135: done queuing things up, now waiting for results 
queue to drain 7491 1727203994.55137: waiting for pending results... 7491 1727203994.55461: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7491 1727203994.55613: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000be 7491 1727203994.55633: variable 'ansible_search_path' from source: unknown 7491 1727203994.55639: variable 'ansible_search_path' from source: unknown 7491 1727203994.55684: calling self._execute() 7491 1727203994.55790: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203994.55803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203994.55818: variable 'omit' from source: magic vars 7491 1727203994.56197: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.56214: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203994.56393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203994.59704: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203994.59806: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203994.59856: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203994.59906: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203994.59970: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203994.60060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7491 1727203994.60540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.60543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.60583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.60597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.60702: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.60717: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7491 1727203994.60723: when evaluation is False, skipping this task 7491 1727203994.60726: _execute() done 7491 1727203994.60729: dumping result to json 7491 1727203994.60733: done dumping result, returning 7491 1727203994.60746: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-0000000000be] 7491 1727203994.60752: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000be 7491 1727203994.60852: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000be 7491 1727203994.60855: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7491 1727203994.60920: 
no more pending results, returning what we have 7491 1727203994.60924: results queue empty 7491 1727203994.60925: checking for any_errors_fatal 7491 1727203994.60936: done checking for any_errors_fatal 7491 1727203994.60937: checking for max_fail_percentage 7491 1727203994.60939: done checking for max_fail_percentage 7491 1727203994.60940: checking to see if all hosts have failed and the running result is not ok 7491 1727203994.60942: done checking to see if all hosts have failed 7491 1727203994.60943: getting the remaining hosts for this loop 7491 1727203994.60945: done getting the remaining hosts for this loop 7491 1727203994.60950: getting the next task for host managed-node3 7491 1727203994.60957: done getting next task for host managed-node3 7491 1727203994.60962: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7491 1727203994.60967: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203994.60988: getting variables 7491 1727203994.60990: in VariableManager get_vars() 7491 1727203994.61039: Calling all_inventory to load vars for managed-node3 7491 1727203994.61043: Calling groups_inventory to load vars for managed-node3 7491 1727203994.61045: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203994.61055: Calling all_plugins_play to load vars for managed-node3 7491 1727203994.61058: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203994.61061: Calling groups_plugins_play to load vars for managed-node3 7491 1727203994.64531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203994.73451: done with get_vars() 7491 1727203994.73484: done getting variables 7491 1727203994.73540: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:53:14 -0400 (0:00:00.197) 0:00:36.667 ***** 7491 1727203994.74412: entering _queue_task() for managed-node3/fail 7491 1727203994.74774: worker is 1 (out of 1 available) 7491 1727203994.74788: exiting _queue_task() for managed-node3/fail 7491 1727203994.74805: done queuing things up, now waiting for results queue to drain 7491 1727203994.74808: waiting for pending results... 
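The YUM variant of the package-update check was skipped because `ansible_distribution_major_version | int < 8` came out false: Ansible facts arrive as strings, and Jinja2's `int` filter coerces them, falling back to a default instead of raising. A stand-alone sketch of that coercion (loosely mirroring the filter's behavior, not Ansible's or Jinja2's actual code; the sample version value is illustrative):

```python
def jinja_int(value, default=0):
    # Loosely mirrors Jinja2's `int` filter: best-effort coercion
    # to int, returning a default on failure rather than raising.
    try:
        return int(value)
    except (TypeError, ValueError):
        return default

# Facts such as ansible_distribution_major_version are strings,
# which is why the conditions in this log template them through
# `| int` before comparing numerically.
major = "9"  # illustrative value for a RHEL 9 managed node
run_yum_check = jinja_int(major) < 8  # False -> the YUM task is skipped
```

Without the `| int` filter, the comparison would be a string comparison (or a templating error, depending on strictness), which is a common source of subtly wrong `when` clauses.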
7491 1727203994.75121: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7491 1727203994.75276: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000bf 7491 1727203994.75290: variable 'ansible_search_path' from source: unknown 7491 1727203994.75294: variable 'ansible_search_path' from source: unknown 7491 1727203994.75326: calling self._execute() 7491 1727203994.75412: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203994.75417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203994.75427: variable 'omit' from source: magic vars 7491 1727203994.75728: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.75738: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203994.75828: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203994.75967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203994.77916: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203994.78000: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203994.78041: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203994.78083: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203994.78116: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203994.78195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7491 1727203994.78231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.78267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.78316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.78339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.78378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.78397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.78421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.78452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.78462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7491 1727203994.78499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.78520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.78536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.78561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.78573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.78696: variable 'network_connections' from source: task vars 7491 1727203994.78706: variable 'interface' from source: play vars 7491 1727203994.78761: variable 'interface' from source: play vars 7491 1727203994.78821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203994.78939: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203994.78967: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203994.78989: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203994.79010: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7491 1727203994.79043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203994.79061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203994.79080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.79097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203994.79145: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203994.79305: variable 'network_connections' from source: task vars 7491 1727203994.79309: variable 'interface' from source: play vars 7491 1727203994.79355: variable 'interface' from source: play vars 7491 1727203994.79385: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727203994.79389: when evaluation is False, skipping this task 7491 1727203994.79392: _execute() done 7491 1727203994.79395: dumping result to json 7491 1727203994.79396: done dumping result, returning 7491 1727203994.79403: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-0000000000bf] 7491 1727203994.79408: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000bf skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 7491 1727203994.79558: no more pending results, returning what we have 7491 1727203994.79562: results queue empty 7491 1727203994.79563: checking for any_errors_fatal 7491 1727203994.79573: done checking for any_errors_fatal 7491 1727203994.79574: checking for max_fail_percentage 7491 1727203994.79576: done checking for max_fail_percentage 7491 1727203994.79577: checking to see if all hosts have failed and the running result is not ok 7491 1727203994.79578: done checking to see if all hosts have failed 7491 1727203994.79579: getting the remaining hosts for this loop 7491 1727203994.79581: done getting the remaining hosts for this loop 7491 1727203994.79585: getting the next task for host managed-node3 7491 1727203994.79591: done getting next task for host managed-node3 7491 1727203994.79597: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7491 1727203994.79599: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203994.79622: getting variables 7491 1727203994.79624: in VariableManager get_vars() 7491 1727203994.79673: Calling all_inventory to load vars for managed-node3 7491 1727203994.79676: Calling groups_inventory to load vars for managed-node3 7491 1727203994.79678: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203994.79688: Calling all_plugins_play to load vars for managed-node3 7491 1727203994.79690: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203994.79693: Calling groups_plugins_play to load vars for managed-node3 7491 1727203994.80268: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000bf 7491 1727203994.80276: WORKER PROCESS EXITING 7491 1727203994.80863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203994.82548: done with get_vars() 7491 1727203994.82575: done getting variables 7491 1727203994.82621: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:53:14 -0400 (0:00:00.082) 0:00:36.750 ***** 7491 1727203994.82648: entering _queue_task() for managed-node3/package 7491 1727203994.82880: worker is 1 (out of 1 available) 7491 1727203994.82895: exiting _queue_task() for managed-node3/package 7491 1727203994.82909: done queuing things up, now waiting for results queue to drain 7491 1727203994.82910: waiting for pending results... 
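The `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line earlier in the run comes from collection routing: a plugin name can point at another name, and the loader follows the chain to a terminal plugin before loading it (for ansible-core builtins the routing table ships in `ansible_builtin_runtime.yml`). A hedged sketch of that resolution step (the table and function below are illustrative, not the real plugin loader):

```python
# Illustrative redirect table; real entries come from each
# collection's meta/runtime.yml (ansible_builtin_runtime.yml for
# ansible-core builtins such as yum -> dnf).
REDIRECTS = {"ansible.builtin.yum": "ansible.builtin.dnf"}

def resolve_plugin(name, redirects, max_hops=10):
    """Follow redirects until a terminal plugin name is reached,
    guarding against redirect cycles."""
    seen = set()
    while name in redirects:
        if name in seen or len(seen) >= max_hops:
            raise RuntimeError(f"redirect loop resolving {name!r}")
        seen.add(name)
        name = redirects[name]
    return name

resolved = resolve_plugin("ansible.builtin.yum", REDIRECTS)
# resolves to "ansible.builtin.dnf", matching the redirect in this log
```

This is also why the log shows the YUM-named task loading `.../plugins/action/dnf.py`: the task keyword stays `yum`, but the resolved action plugin is `dnf`.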
7491 1727203994.83107: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 7491 1727203994.83204: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000c0 7491 1727203994.83215: variable 'ansible_search_path' from source: unknown 7491 1727203994.83220: variable 'ansible_search_path' from source: unknown 7491 1727203994.83252: calling self._execute() 7491 1727203994.83333: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203994.83337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203994.83345: variable 'omit' from source: magic vars 7491 1727203994.83635: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.83645: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203994.83789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203994.84009: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203994.84069: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203994.84180: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203994.84196: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203994.84767: variable 'network_packages' from source: role '' defaults 7491 1727203994.84771: variable '__network_provider_setup' from source: role '' defaults 7491 1727203994.84774: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203994.84777: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203994.84779: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727203994.84781: variable '__network_packages_default_nm' from source: role 
'' defaults 7491 1727203994.84784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203994.87169: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203994.87213: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203994.87240: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203994.87265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203994.87287: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203994.87344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.87363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.87384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.87414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.87425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.87457: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.87474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.87492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.87523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.87543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.87706: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7491 1727203994.87792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.87809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.87829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.87854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.87865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.87928: variable 'ansible_python' from source: facts 7491 1727203994.87950: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7491 1727203994.88010: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727203994.88070: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727203994.88155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.88174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.88191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.88215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.88226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.88310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203994.88338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203994.88382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.88449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203994.88476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203994.88722: variable 'network_connections' from source: task vars 7491 1727203994.88734: variable 'interface' from source: play vars 7491 1727203994.88860: variable 'interface' from source: play vars 7491 1727203994.88955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203994.88989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203994.89038: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203994.89075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203994.89136: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203994.89488: variable 'network_connections' from source: task vars 7491 1727203994.89498: variable 'interface' from source: play vars 7491 1727203994.89622: variable 'interface' from source: play vars 7491 1727203994.89694: variable '__network_packages_default_wireless' from source: role '' defaults 7491 1727203994.89790: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203994.90504: variable 'network_connections' from source: task vars 7491 1727203994.90507: variable 'interface' from source: play vars 7491 1727203994.90562: variable 'interface' from source: play vars 7491 1727203994.90586: variable '__network_packages_default_team' from source: role '' defaults 7491 1727203994.90643: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203994.90847: variable 'network_connections' from source: task vars 7491 1727203994.90851: variable 'interface' from source: play vars 7491 1727203994.90899: variable 'interface' from source: play vars 7491 1727203994.90945: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203994.90990: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203994.90995: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727203994.91039: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727203994.91178: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7491 1727203994.91485: variable 'network_connections' from source: task vars 7491 1727203994.91489: variable 'interface' from source: play vars 7491 1727203994.91534: variable 'interface' from source: play vars 7491 
1727203994.91544: variable 'ansible_distribution' from source: facts 7491 1727203994.91547: variable '__network_rh_distros' from source: role '' defaults 7491 1727203994.91553: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.91573: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7491 1727203994.91682: variable 'ansible_distribution' from source: facts 7491 1727203994.91686: variable '__network_rh_distros' from source: role '' defaults 7491 1727203994.91690: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.91701: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7491 1727203994.91809: variable 'ansible_distribution' from source: facts 7491 1727203994.91812: variable '__network_rh_distros' from source: role '' defaults 7491 1727203994.91816: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.91846: variable 'network_provider' from source: set_fact 7491 1727203994.91858: variable 'ansible_facts' from source: unknown 7491 1727203994.92279: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7491 1727203994.92282: when evaluation is False, skipping this task 7491 1727203994.92285: _execute() done 7491 1727203994.92287: dumping result to json 7491 1727203994.92289: done dumping result, returning 7491 1727203994.92292: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-0a4a-ad01-0000000000c0] 7491 1727203994.92294: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c0 7491 1727203994.92394: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c0 7491 1727203994.92397: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result 
was False" } 7491 1727203994.92448: no more pending results, returning what we have 7491 1727203994.92452: results queue empty 7491 1727203994.92453: checking for any_errors_fatal 7491 1727203994.92458: done checking for any_errors_fatal 7491 1727203994.92459: checking for max_fail_percentage 7491 1727203994.92461: done checking for max_fail_percentage 7491 1727203994.92461: checking to see if all hosts have failed and the running result is not ok 7491 1727203994.92462: done checking to see if all hosts have failed 7491 1727203994.92463: getting the remaining hosts for this loop 7491 1727203994.92467: done getting the remaining hosts for this loop 7491 1727203994.92471: getting the next task for host managed-node3 7491 1727203994.92478: done getting next task for host managed-node3 7491 1727203994.92482: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7491 1727203994.92484: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203994.92504: getting variables 7491 1727203994.92506: in VariableManager get_vars() 7491 1727203994.92557: Calling all_inventory to load vars for managed-node3 7491 1727203994.92560: Calling groups_inventory to load vars for managed-node3 7491 1727203994.92562: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203994.92579: Calling all_plugins_play to load vars for managed-node3 7491 1727203994.92581: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203994.92584: Calling groups_plugins_play to load vars for managed-node3 7491 1727203994.94135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203994.95814: done with get_vars() 7491 1727203994.95837: done getting variables 7491 1727203994.95897: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:53:14 -0400 (0:00:00.132) 0:00:36.883 ***** 7491 1727203994.95932: entering _queue_task() for managed-node3/package 7491 1727203994.96232: worker is 1 (out of 1 available) 7491 1727203994.96244: exiting _queue_task() for managed-node3/package 7491 1727203994.96256: done queuing things up, now waiting for results queue to drain 7491 1727203994.96257: waiting for pending results... 
7491 1727203994.96571: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7491 1727203994.96705: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000c1 7491 1727203994.96723: variable 'ansible_search_path' from source: unknown 7491 1727203994.96727: variable 'ansible_search_path' from source: unknown 7491 1727203994.96758: calling self._execute() 7491 1727203994.96861: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203994.96867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203994.96876: variable 'omit' from source: magic vars 7491 1727203994.97276: variable 'ansible_distribution_major_version' from source: facts 7491 1727203994.97287: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203994.97411: variable 'network_state' from source: role '' defaults 7491 1727203994.97422: Evaluated conditional (network_state != {}): False 7491 1727203994.97425: when evaluation is False, skipping this task 7491 1727203994.97429: _execute() done 7491 1727203994.97431: dumping result to json 7491 1727203994.97433: done dumping result, returning 7491 1727203994.97436: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-0a4a-ad01-0000000000c1] 7491 1727203994.97445: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c1 7491 1727203994.97555: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c1 7491 1727203994.97557: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203994.97611: no more pending results, returning what we have 7491 1727203994.97616: results queue empty 7491 1727203994.97617: checking for any_errors_fatal 
7491 1727203994.97624: done checking for any_errors_fatal 7491 1727203994.97624: checking for max_fail_percentage 7491 1727203994.97626: done checking for max_fail_percentage 7491 1727203994.97627: checking to see if all hosts have failed and the running result is not ok 7491 1727203994.97629: done checking to see if all hosts have failed 7491 1727203994.97629: getting the remaining hosts for this loop 7491 1727203994.97632: done getting the remaining hosts for this loop 7491 1727203994.97636: getting the next task for host managed-node3 7491 1727203994.97642: done getting next task for host managed-node3 7491 1727203994.97646: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7491 1727203994.97650: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203994.97673: getting variables 7491 1727203994.97676: in VariableManager get_vars() 7491 1727203994.97730: Calling all_inventory to load vars for managed-node3 7491 1727203994.97733: Calling groups_inventory to load vars for managed-node3 7491 1727203994.97736: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203994.97749: Calling all_plugins_play to load vars for managed-node3 7491 1727203994.97751: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203994.97754: Calling groups_plugins_play to load vars for managed-node3 7491 1727203994.99350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203995.01014: done with get_vars() 7491 1727203995.01044: done getting variables 7491 1727203995.01110: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.052) 0:00:36.935 ***** 7491 1727203995.01146: entering _queue_task() for managed-node3/package 7491 1727203995.01478: worker is 1 (out of 1 available) 7491 1727203995.01492: exiting _queue_task() for managed-node3/package 7491 1727203995.01505: done queuing things up, now waiting for results queue to drain 7491 1727203995.01506: waiting for pending results... 
7491 1727203995.01822: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7491 1727203995.01978: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000c2 7491 1727203995.02001: variable 'ansible_search_path' from source: unknown 7491 1727203995.02011: variable 'ansible_search_path' from source: unknown 7491 1727203995.02054: calling self._execute() 7491 1727203995.02162: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203995.02178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203995.02192: variable 'omit' from source: magic vars 7491 1727203995.02595: variable 'ansible_distribution_major_version' from source: facts 7491 1727203995.02618: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203995.02745: variable 'network_state' from source: role '' defaults 7491 1727203995.02761: Evaluated conditional (network_state != {}): False 7491 1727203995.02772: when evaluation is False, skipping this task 7491 1727203995.02779: _execute() done 7491 1727203995.02786: dumping result to json 7491 1727203995.02794: done dumping result, returning 7491 1727203995.02806: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-0a4a-ad01-0000000000c2] 7491 1727203995.02822: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c2 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203995.02984: no more pending results, returning what we have 7491 1727203995.02988: results queue empty 7491 1727203995.02989: checking for any_errors_fatal 7491 1727203995.02998: done checking for any_errors_fatal 7491 1727203995.02999: checking for max_fail_percentage 7491 1727203995.03002: done checking for 
max_fail_percentage 7491 1727203995.03003: checking to see if all hosts have failed and the running result is not ok 7491 1727203995.03004: done checking to see if all hosts have failed 7491 1727203995.03005: getting the remaining hosts for this loop 7491 1727203995.03007: done getting the remaining hosts for this loop 7491 1727203995.03012: getting the next task for host managed-node3 7491 1727203995.03019: done getting next task for host managed-node3 7491 1727203995.03025: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7491 1727203995.03028: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203995.03052: getting variables 7491 1727203995.03055: in VariableManager get_vars() 7491 1727203995.03112: Calling all_inventory to load vars for managed-node3 7491 1727203995.03116: Calling groups_inventory to load vars for managed-node3 7491 1727203995.03118: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203995.03132: Calling all_plugins_play to load vars for managed-node3 7491 1727203995.03135: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203995.03138: Calling groups_plugins_play to load vars for managed-node3 7491 1727203995.04604: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c2 7491 1727203995.04608: WORKER PROCESS EXITING 7491 1727203995.05225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203995.07304: done with get_vars() 7491 1727203995.07334: done getting variables 7491 1727203995.07501: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.063) 0:00:36.999 ***** 7491 1727203995.07538: entering _queue_task() for managed-node3/service 7491 1727203995.08046: worker is 1 (out of 1 available) 7491 1727203995.08060: exiting _queue_task() for managed-node3/service 7491 1727203995.08075: done queuing things up, now waiting for results queue to drain 7491 1727203995.08076: waiting for pending results... 
7491 1727203995.08962: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7491 1727203995.09116: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000c3 7491 1727203995.09138: variable 'ansible_search_path' from source: unknown 7491 1727203995.09146: variable 'ansible_search_path' from source: unknown 7491 1727203995.09193: calling self._execute() 7491 1727203995.09304: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203995.09315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203995.09334: variable 'omit' from source: magic vars 7491 1727203995.09740: variable 'ansible_distribution_major_version' from source: facts 7491 1727203995.09757: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203995.09896: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203995.10111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203995.13111: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203995.13204: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203995.13244: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203995.13283: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203995.13314: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203995.13397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 
1727203995.13431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203995.13456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.13502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203995.13519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203995.13570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203995.13594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203995.13624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.13669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203995.13683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7491 1727203995.13729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203995.13755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203995.13781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.13824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203995.13839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203995.14038: variable 'network_connections' from source: task vars 7491 1727203995.14051: variable 'interface' from source: play vars 7491 1727203995.14534: variable 'interface' from source: play vars 7491 1727203995.14622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203995.14796: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203995.14878: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203995.14908: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203995.14946: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203995.14991: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203995.15014: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203995.15048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.15075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203995.15146: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203995.15419: variable 'network_connections' from source: task vars 7491 1727203995.15423: variable 'interface' from source: play vars 7491 1727203995.15495: variable 'interface' from source: play vars 7491 1727203995.15531: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727203995.15534: when evaluation is False, skipping this task 7491 1727203995.15537: _execute() done 7491 1727203995.15540: dumping result to json 7491 1727203995.15542: done dumping result, returning 7491 1727203995.15550: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-0000000000c3] 7491 1727203995.15556: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c3 7491 1727203995.15667: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c3 7491 1727203995.15677: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
7491 1727203995.15723: no more pending results, returning what we have 7491 1727203995.15727: results queue empty 7491 1727203995.15728: checking for any_errors_fatal 7491 1727203995.15735: done checking for any_errors_fatal 7491 1727203995.15736: checking for max_fail_percentage 7491 1727203995.15738: done checking for max_fail_percentage 7491 1727203995.15739: checking to see if all hosts have failed and the running result is not ok 7491 1727203995.15740: done checking to see if all hosts have failed 7491 1727203995.15741: getting the remaining hosts for this loop 7491 1727203995.15743: done getting the remaining hosts for this loop 7491 1727203995.15747: getting the next task for host managed-node3 7491 1727203995.15753: done getting next task for host managed-node3 7491 1727203995.15757: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7491 1727203995.15760: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 7491 1727203995.15781: getting variables 7491 1727203995.15785: in VariableManager get_vars() 7491 1727203995.15837: Calling all_inventory to load vars for managed-node3 7491 1727203995.15840: Calling groups_inventory to load vars for managed-node3 7491 1727203995.15843: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203995.15854: Calling all_plugins_play to load vars for managed-node3 7491 1727203995.15858: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203995.15861: Calling groups_plugins_play to load vars for managed-node3 7491 1727203995.17551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203995.19517: done with get_vars() 7491 1727203995.19541: done getting variables 7491 1727203995.19614: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.121)       0:00:37.120 *****
7491 1727203995.19649: entering _queue_task() for managed-node3/service 7491 1727203995.20061: worker is 1 (out of 1 available) 7491 1727203995.20077: exiting _queue_task() for managed-node3/service 7491 1727203995.20090: done queuing things up, now waiting for results queue to drain 7491 1727203995.20095: waiting for pending results... 
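The task being queued here is gated by the provider check the log records a few entries later (`Evaluated conditional (network_provider == "nm" or network_state != {}): True`). A minimal sketch of what such a role task might look like — only the task name, the `when` expression, and the `network_service_name` variable are taken from this log; the exact module arguments are assumptions:

```yaml
# Hypothetical sketch, not the role's actual tasks/main.yml:122 content.
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"  # role default; resolves to NetworkManager under the nm provider
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

On systemd hosts the `service` action plugin shown being loaded above dispatches to the `systemd` module, which is consistent with the `AnsiballZ_systemd.py` payload transferred to the remote host later in this log.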
7491 1727203995.20500: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7491 1727203995.20551: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000c4 7491 1727203995.20566: variable 'ansible_search_path' from source: unknown 7491 1727203995.20571: variable 'ansible_search_path' from source: unknown 7491 1727203995.20923: calling self._execute() 7491 1727203995.20927: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203995.20931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203995.20933: variable 'omit' from source: magic vars 7491 1727203995.21132: variable 'ansible_distribution_major_version' from source: facts 7491 1727203995.21159: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203995.21586: variable 'network_provider' from source: set_fact 7491 1727203995.21589: variable 'network_state' from source: role '' defaults 7491 1727203995.21737: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7491 1727203995.21740: variable 'omit' from source: magic vars 7491 1727203995.21743: variable 'omit' from source: magic vars 7491 1727203995.21746: variable 'network_service_name' from source: role '' defaults 7491 1727203995.21774: variable 'network_service_name' from source: role '' defaults 7491 1727203995.21888: variable '__network_provider_setup' from source: role '' defaults 7491 1727203995.21894: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203995.21961: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727203995.21971: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727203995.22038: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727203995.22262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 
1727203995.24626: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203995.24839: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203995.24843: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203995.24845: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203995.24848: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203995.24968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203995.24971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203995.24974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.24977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203995.24979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203995.25024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7491 1727203995.25046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203995.25071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.25113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203995.25127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203995.25362: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7491 1727203995.25478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203995.25501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203995.25523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.25561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203995.25577: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203995.25676: variable 'ansible_python' from source: facts 7491 1727203995.25703: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7491 1727203995.25776: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727203995.25850: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727203995.25978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203995.26004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203995.26025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.26062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203995.26078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203995.26127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203995.26159: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203995.26173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.26216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203995.26223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203995.26347: variable 'network_connections' from source: task vars 7491 1727203995.26355: variable 'interface' from source: play vars 7491 1727203995.26413: variable 'interface' from source: play vars 7491 1727203995.26523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203995.26766: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203995.26840: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203995.26913: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203995.26962: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203995.27056: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203995.27094: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203995.27153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.27200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203995.27269: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203995.27596: variable 'network_connections' from source: task vars 7491 1727203995.27610: variable 'interface' from source: play vars 7491 1727203995.27707: variable 'interface' from source: play vars 7491 1727203995.27767: variable '__network_packages_default_wireless' from source: role '' defaults 7491 1727203995.27857: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203995.28314: variable 'network_connections' from source: task vars 7491 1727203995.28359: variable 'interface' from source: play vars 7491 1727203995.28441: variable 'interface' from source: play vars 7491 1727203995.28455: variable '__network_packages_default_team' from source: role '' defaults 7491 1727203995.28512: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727203995.28722: variable 'network_connections' from source: task vars 7491 1727203995.28725: variable 'interface' from source: play vars 7491 1727203995.28776: variable 'interface' from source: play vars 7491 1727203995.28825: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203995.28869: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727203995.28872: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7491 1727203995.28919: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727203995.29053: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7491 1727203995.29384: variable 'network_connections' from source: task vars 7491 1727203995.29387: variable 'interface' from source: play vars 7491 1727203995.29431: variable 'interface' from source: play vars 7491 1727203995.29440: variable 'ansible_distribution' from source: facts 7491 1727203995.29443: variable '__network_rh_distros' from source: role '' defaults 7491 1727203995.29450: variable 'ansible_distribution_major_version' from source: facts 7491 1727203995.29469: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7491 1727203995.29585: variable 'ansible_distribution' from source: facts 7491 1727203995.29589: variable '__network_rh_distros' from source: role '' defaults 7491 1727203995.29593: variable 'ansible_distribution_major_version' from source: facts 7491 1727203995.29606: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7491 1727203995.29723: variable 'ansible_distribution' from source: facts 7491 1727203995.29726: variable '__network_rh_distros' from source: role '' defaults 7491 1727203995.29729: variable 'ansible_distribution_major_version' from source: facts 7491 1727203995.29755: variable 'network_provider' from source: set_fact 7491 1727203995.29776: variable 'omit' from source: magic vars 7491 1727203995.29798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203995.29821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203995.29835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203995.29848: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203995.29863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203995.29883: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203995.29886: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203995.29888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203995.29955: Set connection var ansible_timeout to 10 7491 1727203995.29962: Set connection var ansible_pipelining to False 7491 1727203995.29971: Set connection var ansible_shell_type to sh 7491 1727203995.29975: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203995.29981: Set connection var ansible_shell_executable to /bin/sh 7491 1727203995.29986: Set connection var ansible_connection to ssh 7491 1727203995.30005: variable 'ansible_shell_executable' from source: unknown 7491 1727203995.30008: variable 'ansible_connection' from source: unknown 7491 1727203995.30011: variable 'ansible_module_compression' from source: unknown 7491 1727203995.30013: variable 'ansible_shell_type' from source: unknown 7491 1727203995.30015: variable 'ansible_shell_executable' from source: unknown 7491 1727203995.30020: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203995.30022: variable 'ansible_pipelining' from source: unknown 7491 1727203995.30024: variable 'ansible_timeout' from source: unknown 7491 1727203995.30027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203995.30103: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203995.30112: variable 'omit' from source: magic vars 7491 1727203995.30123: starting attempt loop 7491 1727203995.30126: running the handler 7491 1727203995.30224: variable 'ansible_facts' from source: unknown 7491 1727203995.31215: _low_level_execute_command(): starting 7491 1727203995.31220: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203995.31851: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203995.31860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203995.31907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203995.31914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727203995.31922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203995.31938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203995.31945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203995.32027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203995.32043: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 7491 1727203995.32052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203995.32125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203995.33756: stdout chunk (state=3): >>>/root <<< 7491 1727203995.33856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203995.33910: stderr chunk (state=3): >>><<< 7491 1727203995.33919: stdout chunk (state=3): >>><<< 7491 1727203995.33965: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203995.33970: _low_level_execute_command(): starting 7491 1727203995.33973: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727203995.3394015-9216-158064861853360 `" && echo ansible-tmp-1727203995.3394015-9216-158064861853360="` echo /root/.ansible/tmp/ansible-tmp-1727203995.3394015-9216-158064861853360 `" ) && sleep 0' 7491 1727203995.34608: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203995.34614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203995.34661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203995.34668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727203995.34683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203995.34688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203995.34776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203995.34793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203995.34798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203995.34868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203995.36671: stdout chunk (state=3): 
>>>ansible-tmp-1727203995.3394015-9216-158064861853360=/root/.ansible/tmp/ansible-tmp-1727203995.3394015-9216-158064861853360 <<< 7491 1727203995.36782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203995.36848: stderr chunk (state=3): >>><<< 7491 1727203995.36853: stdout chunk (state=3): >>><<< 7491 1727203995.36876: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203995.3394015-9216-158064861853360=/root/.ansible/tmp/ansible-tmp-1727203995.3394015-9216-158064861853360 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203995.36902: variable 'ansible_module_compression' from source: unknown 7491 1727203995.36949: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7491 1727203995.37000: variable 'ansible_facts' from source: unknown 7491 1727203995.37133: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203995.3394015-9216-158064861853360/AnsiballZ_systemd.py 7491 1727203995.37252: Sending initial data 7491 1727203995.37255: Sent initial data (154 bytes) 7491 1727203995.37961: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203995.37968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203995.38029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203995.38032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7491 1727203995.38035: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203995.38037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203995.38082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203995.38088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203995.38151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203995.39846: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7491 1727203995.39852: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 
debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203995.39908: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203995.39959: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmphh0d2v0m /root/.ansible/tmp/ansible-tmp-1727203995.3394015-9216-158064861853360/AnsiballZ_systemd.py <<< 7491 1727203995.39997: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203995.41943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203995.42034: stderr chunk (state=3): >>><<< 7491 1727203995.42046: stdout chunk (state=3): >>><<< 7491 1727203995.42078: done transferring module to remote 7491 1727203995.42095: _low_level_execute_command(): starting 7491 1727203995.42105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203995.3394015-9216-158064861853360/ /root/.ansible/tmp/ansible-tmp-1727203995.3394015-9216-158064861853360/AnsiballZ_systemd.py && sleep 0' 7491 1727203995.42697: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203995.42704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203995.42739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203995.42744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203995.42753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203995.42761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203995.42768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203995.42775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203995.42843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203995.42846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203995.42900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203995.44836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203995.44904: stderr chunk (state=3): >>><<< 7491 1727203995.44907: stdout chunk (state=3): >>><<< 7491 1727203995.45000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203995.45003: _low_level_execute_command(): starting 7491 1727203995.45006: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203995.3394015-9216-158064861853360/AnsiballZ_systemd.py && sleep 0' 7491 1727203995.45616: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203995.45633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203995.45650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203995.45677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203995.45722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203995.45734: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203995.45747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203995.45763: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203995.45778: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203995.45792: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 7491 1727203995.45806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203995.45823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203995.45838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203995.45849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203995.45858: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203995.45880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203995.45973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203995.45990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203995.46010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203995.46099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203995.71891: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", 
"OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.sl<<< 7491 1727203995.71925: stdout chunk (state=3): >>>ice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "15171584", "MemoryAvailable": "infinity", "CPUUsageNSec": "200925000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", 
"IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", 
"SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "<<< 7491 1727203995.71943: stdout chunk (state=3): >>>SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "basic.target dbus.socket system.slice 
dbus-broker.service network-pre.target systemd-journald.socket cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:47:46 EDT", "StateChangeTimestampMonotonic": "12973041", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, 
"force": null, "masked": null}}} <<< 7491 1727203995.73547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203995.73609: stderr chunk (state=3): >>><<< 7491 1727203995.73613: stdout chunk (state=3): >>><<< 7491 1727203995.73630: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager 
org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "15171584", "MemoryAvailable": "infinity", "CPUUsageNSec": "200925000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", 
"LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": 
"no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "basic.target dbus.socket system.slice dbus-broker.service network-pre.target systemd-journald.socket cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:47:46 EDT", "StateChangeTimestampMonotonic": "12973041", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": 
"no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203995.73748: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203995.3394015-9216-158064861853360/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203995.73765: _low_level_execute_command(): starting 7491 1727203995.73770: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203995.3394015-9216-158064861853360/ > /dev/null 2>&1 && sleep 0' 7491 1727203995.74248: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203995.74252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203995.74283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203995.74295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203995.74345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203995.74367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203995.74420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203995.76243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203995.76305: stderr chunk (state=3): >>><<< 7491 1727203995.76308: stdout chunk (state=3): >>><<< 7491 1727203995.76326: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203995.76332: handler run complete 7491 1727203995.76372: attempt loop complete, returning result 7491 1727203995.76376: _execute() done 7491 1727203995.76378: dumping result to json 7491 1727203995.76395: done dumping result, returning 7491 1727203995.76404: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-0a4a-ad01-0000000000c4] 7491 1727203995.76409: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c4 7491 1727203995.76654: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c4 7491 1727203995.76656: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203995.76717: no more pending results, returning what we have 7491 1727203995.76720: results queue empty 7491 1727203995.76722: checking for any_errors_fatal 7491 1727203995.76729: done checking for any_errors_fatal 7491 1727203995.76730: checking for max_fail_percentage 7491 1727203995.76732: done checking for max_fail_percentage 7491 1727203995.76733: checking to see if all hosts have failed and the running result is not ok 7491 1727203995.76734: done checking to see if all hosts have failed 7491 1727203995.76734: getting the remaining hosts for this loop 7491 1727203995.76736: done getting the remaining hosts for this loop 7491 1727203995.76740: getting the next task for host managed-node3 7491 1727203995.76746: done getting next task for host managed-node3 7491 1727203995.76750: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7491 1727203995.76752: ^ state 
is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203995.76763: getting variables 7491 1727203995.76767: in VariableManager get_vars() 7491 1727203995.76811: Calling all_inventory to load vars for managed-node3 7491 1727203995.76814: Calling groups_inventory to load vars for managed-node3 7491 1727203995.76816: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203995.76828: Calling all_plugins_play to load vars for managed-node3 7491 1727203995.76831: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203995.76834: Calling groups_plugins_play to load vars for managed-node3 7491 1727203995.77667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203995.79276: done with get_vars() 7491 1727203995.79298: done getting variables 7491 1727203995.79346: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.597) 0:00:37.717 ***** 7491 1727203995.79375: entering 
_queue_task() for managed-node3/service 7491 1727203995.79609: worker is 1 (out of 1 available) 7491 1727203995.79626: exiting _queue_task() for managed-node3/service 7491 1727203995.79639: done queuing things up, now waiting for results queue to drain 7491 1727203995.79640: waiting for pending results... 7491 1727203995.79839: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7491 1727203995.79933: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000c5 7491 1727203995.79945: variable 'ansible_search_path' from source: unknown 7491 1727203995.79949: variable 'ansible_search_path' from source: unknown 7491 1727203995.79980: calling self._execute() 7491 1727203995.80059: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203995.80066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203995.80076: variable 'omit' from source: magic vars 7491 1727203995.80367: variable 'ansible_distribution_major_version' from source: facts 7491 1727203995.80377: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203995.80461: variable 'network_provider' from source: set_fact 7491 1727203995.80471: Evaluated conditional (network_provider == "nm"): True 7491 1727203995.80537: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727203995.80604: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727203995.80727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203995.83347: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203995.83419: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203995.83460: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203995.83500: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203995.83533: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203995.83615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203995.83650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203995.83684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.83729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203995.83748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203995.83800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203995.83828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203995.83858: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.83904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203995.83922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203995.83967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203995.83996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203995.84024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.84067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203995.84086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203995.84230: variable 'network_connections' from source: task vars 7491 1727203995.84246: variable 'interface' from source: play vars 7491 1727203995.84322: variable 'interface' from source: play vars 7491 1727203995.84403: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727203995.84585: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727203995.84626: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727203995.84661: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727203995.84695: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727203995.84742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727203995.84771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727203995.84801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203995.84831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727203995.84885: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727203995.85133: variable 'network_connections' from source: task vars 7491 1727203995.85144: variable 'interface' from source: play vars 7491 1727203995.85213: variable 'interface' from source: play vars 7491 1727203995.85257: Evaluated conditional (__network_wpa_supplicant_required): False 7491 1727203995.85267: when evaluation is False, skipping this task 7491 1727203995.85273: _execute() done 7491 1727203995.85278: 
dumping result to json 7491 1727203995.85284: done dumping result, returning 7491 1727203995.85293: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-0a4a-ad01-0000000000c5] 7491 1727203995.85309: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c5 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7491 1727203995.85450: no more pending results, returning what we have 7491 1727203995.85454: results queue empty 7491 1727203995.85455: checking for any_errors_fatal 7491 1727203995.85477: done checking for any_errors_fatal 7491 1727203995.85478: checking for max_fail_percentage 7491 1727203995.85480: done checking for max_fail_percentage 7491 1727203995.85481: checking to see if all hosts have failed and the running result is not ok 7491 1727203995.85482: done checking to see if all hosts have failed 7491 1727203995.85483: getting the remaining hosts for this loop 7491 1727203995.85484: done getting the remaining hosts for this loop 7491 1727203995.85489: getting the next task for host managed-node3 7491 1727203995.85496: done getting next task for host managed-node3 7491 1727203995.85499: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7491 1727203995.85502: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203995.85528: getting variables 7491 1727203995.85530: in VariableManager get_vars() 7491 1727203995.85581: Calling all_inventory to load vars for managed-node3 7491 1727203995.85584: Calling groups_inventory to load vars for managed-node3 7491 1727203995.85586: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203995.85597: Calling all_plugins_play to load vars for managed-node3 7491 1727203995.85600: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203995.85603: Calling groups_plugins_play to load vars for managed-node3 7491 1727203995.86124: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c5 7491 1727203995.86127: WORKER PROCESS EXITING 7491 1727203995.87330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203995.89019: done with get_vars() 7491 1727203995.89045: done getting variables 7491 1727203995.89105: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.097) 0:00:37.815 ***** 7491 1727203995.89137: entering _queue_task() for managed-node3/service 7491 1727203995.89484: worker is 1 (out of 1 available) 7491 1727203995.89495: exiting _queue_task() for managed-node3/service 7491 1727203995.89508: done queuing things up, now waiting for results queue to drain 7491 1727203995.89509: waiting for pending results... 
7491 1727203995.89841: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 7491 1727203995.90027: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000c6 7491 1727203995.90057: variable 'ansible_search_path' from source: unknown 7491 1727203995.90072: variable 'ansible_search_path' from source: unknown 7491 1727203995.90125: calling self._execute() 7491 1727203995.90256: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203995.90277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203995.90306: variable 'omit' from source: magic vars 7491 1727203995.90856: variable 'ansible_distribution_major_version' from source: facts 7491 1727203995.90877: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203995.91038: variable 'network_provider' from source: set_fact 7491 1727203995.91050: Evaluated conditional (network_provider == "initscripts"): False 7491 1727203995.91063: when evaluation is False, skipping this task 7491 1727203995.91074: _execute() done 7491 1727203995.91082: dumping result to json 7491 1727203995.91089: done dumping result, returning 7491 1727203995.91099: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-0a4a-ad01-0000000000c6] 7491 1727203995.91110: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c6 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727203995.91275: no more pending results, returning what we have 7491 1727203995.91280: results queue empty 7491 1727203995.91281: checking for any_errors_fatal 7491 1727203995.91295: done checking for any_errors_fatal 7491 1727203995.91295: checking for max_fail_percentage 7491 1727203995.91298: done checking for max_fail_percentage 7491 1727203995.91299: checking 
to see if all hosts have failed and the running result is not ok 7491 1727203995.91300: done checking to see if all hosts have failed 7491 1727203995.91301: getting the remaining hosts for this loop 7491 1727203995.91303: done getting the remaining hosts for this loop 7491 1727203995.91307: getting the next task for host managed-node3 7491 1727203995.91315: done getting next task for host managed-node3 7491 1727203995.91319: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7491 1727203995.91322: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203995.91348: getting variables 7491 1727203995.91351: in VariableManager get_vars() 7491 1727203995.91408: Calling all_inventory to load vars for managed-node3 7491 1727203995.91411: Calling groups_inventory to load vars for managed-node3 7491 1727203995.91413: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203995.91425: Calling all_plugins_play to load vars for managed-node3 7491 1727203995.91428: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203995.91431: Calling groups_plugins_play to load vars for managed-node3 7491 1727203995.92414: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c6 7491 1727203995.92418: WORKER PROCESS EXITING 7491 1727203995.92808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203995.93743: done with get_vars() 7491 1727203995.93762: done getting variables 7491 1727203995.93831: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.047) 0:00:37.862 ***** 7491 1727203995.93865: entering _queue_task() for managed-node3/copy 7491 1727203995.94237: worker is 1 (out of 1 available) 7491 1727203995.94250: exiting _queue_task() for managed-node3/copy 7491 1727203995.94263: done queuing things up, now waiting for results queue to drain 7491 1727203995.94267: waiting for pending results... 
7491 1727203995.94588: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7491 1727203995.94769: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000c7 7491 1727203995.94790: variable 'ansible_search_path' from source: unknown 7491 1727203995.94813: variable 'ansible_search_path' from source: unknown 7491 1727203995.94912: calling self._execute() 7491 1727203995.95062: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203995.95081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203995.95095: variable 'omit' from source: magic vars 7491 1727203995.95396: variable 'ansible_distribution_major_version' from source: facts 7491 1727203995.95406: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203995.95490: variable 'network_provider' from source: set_fact 7491 1727203995.95496: Evaluated conditional (network_provider == "initscripts"): False 7491 1727203995.95499: when evaluation is False, skipping this task 7491 1727203995.95501: _execute() done 7491 1727203995.95505: dumping result to json 7491 1727203995.95507: done dumping result, returning 7491 1727203995.95516: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-0a4a-ad01-0000000000c7] 7491 1727203995.95528: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c7 7491 1727203995.95624: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c7 7491 1727203995.95628: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7491 1727203995.95677: no more pending results, returning what we have 7491 1727203995.95680: results queue empty 7491 1727203995.95681: checking for any_errors_fatal 7491 
1727203995.95687: done checking for any_errors_fatal 7491 1727203995.95688: checking for max_fail_percentage 7491 1727203995.95690: done checking for max_fail_percentage 7491 1727203995.95691: checking to see if all hosts have failed and the running result is not ok 7491 1727203995.95692: done checking to see if all hosts have failed 7491 1727203995.95692: getting the remaining hosts for this loop 7491 1727203995.95694: done getting the remaining hosts for this loop 7491 1727203995.95698: getting the next task for host managed-node3 7491 1727203995.95705: done getting next task for host managed-node3 7491 1727203995.95709: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7491 1727203995.95712: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203995.95733: getting variables 7491 1727203995.95735: in VariableManager get_vars() 7491 1727203995.95783: Calling all_inventory to load vars for managed-node3 7491 1727203995.95786: Calling groups_inventory to load vars for managed-node3 7491 1727203995.95788: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203995.95798: Calling all_plugins_play to load vars for managed-node3 7491 1727203995.95800: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203995.95803: Calling groups_plugins_play to load vars for managed-node3 7491 1727203995.96715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203995.98324: done with get_vars() 7491 1727203995.98357: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:53:15 -0400 (0:00:00.045) 0:00:37.908 ***** 7491 1727203995.98451: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 7491 1727203995.98908: worker is 1 (out of 1 available) 7491 1727203995.98923: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 7491 1727203995.98937: done queuing things up, now waiting for results queue to drain 7491 1727203995.98938: waiting for pending results... 
7491 1727203995.99160: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7491 1727203995.99255: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000c8 7491 1727203995.99267: variable 'ansible_search_path' from source: unknown 7491 1727203995.99271: variable 'ansible_search_path' from source: unknown 7491 1727203995.99302: calling self._execute() 7491 1727203995.99385: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203995.99389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203995.99398: variable 'omit' from source: magic vars 7491 1727203995.99689: variable 'ansible_distribution_major_version' from source: facts 7491 1727203995.99699: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203995.99705: variable 'omit' from source: magic vars 7491 1727203995.99754: variable 'omit' from source: magic vars 7491 1727203995.99873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727203996.02102: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727203996.02186: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727203996.02237: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727203996.02279: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727203996.02309: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727203996.02398: variable 'network_provider' from source: set_fact 7491 1727203996.02542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727203996.02596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727203996.02632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727203996.02680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727203996.02701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727203996.02786: variable 'omit' from source: magic vars 7491 1727203996.02915: variable 'omit' from source: magic vars 7491 1727203996.03035: variable 'network_connections' from source: task vars 7491 1727203996.03054: variable 'interface' from source: play vars 7491 1727203996.03130: variable 'interface' from source: play vars 7491 1727203996.03279: variable 'omit' from source: magic vars 7491 1727203996.03286: variable '__lsr_ansible_managed' from source: task vars 7491 1727203996.03331: variable '__lsr_ansible_managed' from source: task vars 7491 1727203996.03537: Loaded config def from plugin (lookup/template) 7491 1727203996.03541: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7491 1727203996.03568: File lookup term: get_ansible_managed.j2 7491 1727203996.03571: variable 'ansible_search_path' from source: unknown 7491 1727203996.03577: evaluation_path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7491 1727203996.03589: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7491 1727203996.03603: variable 'ansible_search_path' from source: unknown 7491 1727203996.07889: variable 'ansible_managed' from source: unknown 7491 1727203996.07984: variable 'omit' from source: magic vars 7491 1727203996.08006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203996.08030: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203996.08049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203996.08064: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203996.08073: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203996.08096: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203996.08099: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.08101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203996.08173: Set connection var ansible_timeout to 10 7491 1727203996.08176: Set connection var ansible_pipelining to False 7491 1727203996.08182: Set connection var ansible_shell_type to sh 7491 1727203996.08187: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203996.08193: Set connection var ansible_shell_executable to /bin/sh 7491 1727203996.08198: Set connection var ansible_connection to ssh 7491 1727203996.08215: variable 'ansible_shell_executable' from source: unknown 7491 1727203996.08218: variable 'ansible_connection' from source: unknown 7491 1727203996.08223: variable 'ansible_module_compression' from source: unknown 7491 1727203996.08226: variable 'ansible_shell_type' from source: unknown 7491 1727203996.08228: variable 'ansible_shell_executable' from source: unknown 7491 1727203996.08230: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.08235: variable 'ansible_pipelining' from source: unknown 7491 1727203996.08237: variable 'ansible_timeout' from source: unknown 7491 1727203996.08241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203996.08340: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203996.08351: variable 'omit' from source: magic vars 7491 1727203996.08354: starting attempt loop 7491 1727203996.08356: running the handler 7491 
1727203996.08373: _low_level_execute_command(): starting 7491 1727203996.08381: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203996.08891: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203996.08912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.08935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203996.08945: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.08986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203996.08998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203996.09060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203996.10714: stdout chunk (state=3): >>>/root <<< 7491 1727203996.10817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203996.10887: stderr chunk (state=3): >>><<< 7491 1727203996.10890: stdout chunk (state=3): >>><<< 7491 1727203996.10910: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203996.10922: _low_level_execute_command(): starting 7491 1727203996.10926: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203996.109112-9262-53834091205965 `" && echo ansible-tmp-1727203996.109112-9262-53834091205965="` echo /root/.ansible/tmp/ansible-tmp-1727203996.109112-9262-53834091205965 `" ) && sleep 0' 7491 1727203996.11413: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203996.11429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.11447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match not found <<< 7491 1727203996.11458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.11474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.11520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203996.11532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203996.11587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203996.13529: stdout chunk (state=3): >>>ansible-tmp-1727203996.109112-9262-53834091205965=/root/.ansible/tmp/ansible-tmp-1727203996.109112-9262-53834091205965 <<< 7491 1727203996.13626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203996.13693: stderr chunk (state=3): >>><<< 7491 1727203996.13697: stdout chunk (state=3): >>><<< 7491 1727203996.13714: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203996.109112-9262-53834091205965=/root/.ansible/tmp/ansible-tmp-1727203996.109112-9262-53834091205965 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203996.13763: variable 'ansible_module_compression' from source: unknown 7491 1727203996.13803: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7491 1727203996.13832: variable 'ansible_facts' from source: unknown 7491 1727203996.13900: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203996.109112-9262-53834091205965/AnsiballZ_network_connections.py 7491 1727203996.14018: Sending initial data 7491 1727203996.14024: Sent initial data (164 bytes) 7491 1727203996.14746: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203996.14752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.14780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203996.14793: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.14847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203996.14861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203996.14920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203996.16666: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203996.16704: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203996.16745: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpf9rb_6zt /root/.ansible/tmp/ansible-tmp-1727203996.109112-9262-53834091205965/AnsiballZ_network_connections.py <<< 7491 1727203996.16790: stderr chunk (state=3): >>>debug1: Couldn't stat 
remote file: No such file or directory <<< 7491 1727203996.17937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203996.18053: stderr chunk (state=3): >>><<< 7491 1727203996.18057: stdout chunk (state=3): >>><<< 7491 1727203996.18080: done transferring module to remote 7491 1727203996.18090: _low_level_execute_command(): starting 7491 1727203996.18095: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203996.109112-9262-53834091205965/ /root/.ansible/tmp/ansible-tmp-1727203996.109112-9262-53834091205965/AnsiballZ_network_connections.py && sleep 0' 7491 1727203996.18569: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203996.18576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.18608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.18622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203996.18634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203996.18643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.18688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203996.18700: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203996.18751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203996.20596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203996.20654: stderr chunk (state=3): >>><<< 7491 1727203996.20658: stdout chunk (state=3): >>><<< 7491 1727203996.20676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203996.20679: _low_level_execute_command(): starting 7491 1727203996.20684: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203996.109112-9262-53834091205965/AnsiballZ_network_connections.py && sleep 0' 7491 1727203996.21148: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config <<< 7491 1727203996.21152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.21191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.21205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203996.21216: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.21261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203996.21276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203996.21337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203996.49530: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], 
"gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7491 1727203996.51873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203996.51936: stderr chunk (state=3): >>><<< 7491 1727203996.51940: stdout chunk (state=3): >>><<< 7491 1727203996.51957: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
7491 1727203996.51993: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'auto_gateway': False, 'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/64', '203.0.113.2/24'], 'gateway6': '2001:db8::1', 'gateway4': '203.0.113.1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203996.109112-9262-53834091205965/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203996.52002: _low_level_execute_command(): starting 7491 1727203996.52007: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203996.109112-9262-53834091205965/ > /dev/null 2>&1 && sleep 0' 7491 1727203996.52486: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203996.52490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.52524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.52538: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.52594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203996.52600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203996.52615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203996.52672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203996.54456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203996.54513: stderr chunk (state=3): >>><<< 7491 1727203996.54517: stdout chunk (state=3): >>><<< 7491 1727203996.54534: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203996.54540: handler run complete 7491 1727203996.54566: attempt loop complete, returning result 7491 1727203996.54569: _execute() done 7491 1727203996.54572: dumping result to json 7491 1727203996.54577: done dumping result, returning 7491 1727203996.54585: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-0a4a-ad01-0000000000c8] 7491 1727203996.54590: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c8 7491 1727203996.54706: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c8 7491 1727203996.54710: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": false, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441 (not-active) 7491 1727203996.54842: no more pending results, returning what we have 7491 1727203996.54846: results queue empty 7491 1727203996.54847: checking for any_errors_fatal 7491 1727203996.54853: done checking for any_errors_fatal 7491 1727203996.54854: checking for max_fail_percentage 
7491 1727203996.54855: done checking for max_fail_percentage 7491 1727203996.54856: checking to see if all hosts have failed and the running result is not ok 7491 1727203996.54857: done checking to see if all hosts have failed 7491 1727203996.54858: getting the remaining hosts for this loop 7491 1727203996.54860: done getting the remaining hosts for this loop 7491 1727203996.54863: getting the next task for host managed-node3 7491 1727203996.54870: done getting next task for host managed-node3 7491 1727203996.54873: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7491 1727203996.54876: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203996.54886: getting variables 7491 1727203996.54888: in VariableManager get_vars() 7491 1727203996.54936: Calling all_inventory to load vars for managed-node3 7491 1727203996.54939: Calling groups_inventory to load vars for managed-node3 7491 1727203996.54941: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203996.54949: Calling all_plugins_play to load vars for managed-node3 7491 1727203996.54951: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203996.54954: Calling groups_plugins_play to load vars for managed-node3 7491 1727203996.55784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203996.56833: done with get_vars() 7491 1727203996.56849: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:53:16 -0400 (0:00:00.584) 0:00:38.493 ***** 7491 1727203996.56911: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 7491 1727203996.57142: worker is 1 (out of 1 available) 7491 1727203996.57154: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 7491 1727203996.57168: done queuing things up, now waiting for results queue to drain 7491 1727203996.57169: waiting for pending results... 
7491 1727203996.57361: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 7491 1727203996.57448: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000c9 7491 1727203996.57460: variable 'ansible_search_path' from source: unknown 7491 1727203996.57463: variable 'ansible_search_path' from source: unknown 7491 1727203996.57497: calling self._execute() 7491 1727203996.57570: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.57574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203996.57586: variable 'omit' from source: magic vars 7491 1727203996.57867: variable 'ansible_distribution_major_version' from source: facts 7491 1727203996.57878: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203996.57963: variable 'network_state' from source: role '' defaults 7491 1727203996.57972: Evaluated conditional (network_state != {}): False 7491 1727203996.57976: when evaluation is False, skipping this task 7491 1727203996.57978: _execute() done 7491 1727203996.57981: dumping result to json 7491 1727203996.57984: done dumping result, returning 7491 1727203996.57990: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-0a4a-ad01-0000000000c9] 7491 1727203996.57997: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c9 7491 1727203996.58087: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000c9 7491 1727203996.58090: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727203996.58145: no more pending results, returning what we have 7491 1727203996.58150: results queue empty 7491 1727203996.58151: checking for any_errors_fatal 7491 1727203996.58159: done checking for any_errors_fatal 7491 1727203996.58160: 
checking for max_fail_percentage 7491 1727203996.58162: done checking for max_fail_percentage 7491 1727203996.58163: checking to see if all hosts have failed and the running result is not ok 7491 1727203996.58166: done checking to see if all hosts have failed 7491 1727203996.58167: getting the remaining hosts for this loop 7491 1727203996.58169: done getting the remaining hosts for this loop 7491 1727203996.58173: getting the next task for host managed-node3 7491 1727203996.58178: done getting next task for host managed-node3 7491 1727203996.58182: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7491 1727203996.58185: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203996.58208: getting variables 7491 1727203996.58210: in VariableManager get_vars() 7491 1727203996.58253: Calling all_inventory to load vars for managed-node3 7491 1727203996.58256: Calling groups_inventory to load vars for managed-node3 7491 1727203996.58258: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203996.58267: Calling all_plugins_play to load vars for managed-node3 7491 1727203996.58269: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203996.58271: Calling groups_plugins_play to load vars for managed-node3 7491 1727203996.59059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203996.59999: done with get_vars() 7491 1727203996.60021: done getting variables 7491 1727203996.60069: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:53:16 -0400 (0:00:00.031) 0:00:38.524 ***** 7491 1727203996.60093: entering _queue_task() for managed-node3/debug 7491 1727203996.60334: worker is 1 (out of 1 available) 7491 1727203996.60348: exiting _queue_task() for managed-node3/debug 7491 1727203996.60361: done queuing things up, now waiting for results queue to drain 7491 1727203996.60362: waiting for pending results... 
7491 1727203996.60555: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7491 1727203996.60647: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000ca 7491 1727203996.60658: variable 'ansible_search_path' from source: unknown 7491 1727203996.60661: variable 'ansible_search_path' from source: unknown 7491 1727203996.60694: calling self._execute() 7491 1727203996.60775: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.60779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203996.60790: variable 'omit' from source: magic vars 7491 1727203996.61083: variable 'ansible_distribution_major_version' from source: facts 7491 1727203996.61093: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203996.61100: variable 'omit' from source: magic vars 7491 1727203996.61145: variable 'omit' from source: magic vars 7491 1727203996.61171: variable 'omit' from source: magic vars 7491 1727203996.61208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203996.61235: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203996.61255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203996.61271: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203996.61281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203996.61305: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203996.61307: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.61310: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed-node3' 7491 1727203996.61386: Set connection var ansible_timeout to 10 7491 1727203996.61391: Set connection var ansible_pipelining to False 7491 1727203996.61396: Set connection var ansible_shell_type to sh 7491 1727203996.61401: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203996.61408: Set connection var ansible_shell_executable to /bin/sh 7491 1727203996.61413: Set connection var ansible_connection to ssh 7491 1727203996.61433: variable 'ansible_shell_executable' from source: unknown 7491 1727203996.61436: variable 'ansible_connection' from source: unknown 7491 1727203996.61438: variable 'ansible_module_compression' from source: unknown 7491 1727203996.61441: variable 'ansible_shell_type' from source: unknown 7491 1727203996.61443: variable 'ansible_shell_executable' from source: unknown 7491 1727203996.61445: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.61449: variable 'ansible_pipelining' from source: unknown 7491 1727203996.61451: variable 'ansible_timeout' from source: unknown 7491 1727203996.61455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203996.61561: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203996.61574: variable 'omit' from source: magic vars 7491 1727203996.61580: starting attempt loop 7491 1727203996.61583: running the handler 7491 1727203996.61684: variable '__network_connections_result' from source: set_fact 7491 1727203996.61730: handler run complete 7491 1727203996.61743: attempt loop complete, returning result 7491 1727203996.61746: _execute() done 7491 1727203996.61749: dumping result to json 7491 1727203996.61751: done dumping result, returning 7491 
1727203996.61759: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-0a4a-ad01-0000000000ca] 7491 1727203996.61766: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000ca 7491 1727203996.61853: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000ca 7491 1727203996.61855: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441 (not-active)" ] } 7491 1727203996.61929: no more pending results, returning what we have 7491 1727203996.61933: results queue empty 7491 1727203996.61934: checking for any_errors_fatal 7491 1727203996.61942: done checking for any_errors_fatal 7491 1727203996.61942: checking for max_fail_percentage 7491 1727203996.61944: done checking for max_fail_percentage 7491 1727203996.61945: checking to see if all hosts have failed and the running result is not ok 7491 1727203996.61946: done checking to see if all hosts have failed 7491 1727203996.61947: getting the remaining hosts for this loop 7491 1727203996.61949: done getting the remaining hosts for this loop 7491 1727203996.61953: getting the next task for host managed-node3 7491 1727203996.61958: done getting next task for host managed-node3 7491 1727203996.61962: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7491 1727203996.61967: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203996.61978: getting variables 7491 1727203996.61979: in VariableManager get_vars() 7491 1727203996.62030: Calling all_inventory to load vars for managed-node3 7491 1727203996.62033: Calling groups_inventory to load vars for managed-node3 7491 1727203996.62035: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203996.62044: Calling all_plugins_play to load vars for managed-node3 7491 1727203996.62046: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203996.62048: Calling groups_plugins_play to load vars for managed-node3 7491 1727203996.63027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203996.63934: done with get_vars() 7491 1727203996.63952: done getting variables 7491 1727203996.64001: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:53:16 -0400 (0:00:00.039) 0:00:38.564 ***** 7491 1727203996.64026: entering _queue_task() for managed-node3/debug 7491 1727203996.64258: worker is 1 (out of 1 available) 7491 1727203996.64274: exiting _queue_task() for managed-node3/debug 7491 
1727203996.64286: done queuing things up, now waiting for results queue to drain 7491 1727203996.64288: waiting for pending results... 7491 1727203996.64488: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7491 1727203996.64585: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000cb 7491 1727203996.64597: variable 'ansible_search_path' from source: unknown 7491 1727203996.64601: variable 'ansible_search_path' from source: unknown 7491 1727203996.64635: calling self._execute() 7491 1727203996.64715: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.64721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203996.64736: variable 'omit' from source: magic vars 7491 1727203996.65020: variable 'ansible_distribution_major_version' from source: facts 7491 1727203996.65033: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203996.65038: variable 'omit' from source: magic vars 7491 1727203996.65082: variable 'omit' from source: magic vars 7491 1727203996.65108: variable 'omit' from source: magic vars 7491 1727203996.65146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203996.65175: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203996.65197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203996.65209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203996.65222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203996.65243: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203996.65246: variable 
'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.65248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203996.65325: Set connection var ansible_timeout to 10 7491 1727203996.65330: Set connection var ansible_pipelining to False 7491 1727203996.65335: Set connection var ansible_shell_type to sh 7491 1727203996.65340: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203996.65347: Set connection var ansible_shell_executable to /bin/sh 7491 1727203996.65352: Set connection var ansible_connection to ssh 7491 1727203996.65373: variable 'ansible_shell_executable' from source: unknown 7491 1727203996.65377: variable 'ansible_connection' from source: unknown 7491 1727203996.65380: variable 'ansible_module_compression' from source: unknown 7491 1727203996.65382: variable 'ansible_shell_type' from source: unknown 7491 1727203996.65384: variable 'ansible_shell_executable' from source: unknown 7491 1727203996.65388: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.65390: variable 'ansible_pipelining' from source: unknown 7491 1727203996.65392: variable 'ansible_timeout' from source: unknown 7491 1727203996.65394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203996.65499: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203996.65509: variable 'omit' from source: magic vars 7491 1727203996.65518: starting attempt loop 7491 1727203996.65524: running the handler 7491 1727203996.65561: variable '__network_connections_result' from source: set_fact 7491 1727203996.65631: variable '__network_connections_result' from source: set_fact 7491 1727203996.65734: handler run 
complete 7491 1727203996.65754: attempt loop complete, returning result 7491 1727203996.65757: _execute() done 7491 1727203996.65759: dumping result to json 7491 1727203996.65766: done dumping result, returning 7491 1727203996.65773: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-0a4a-ad01-0000000000cb] 7491 1727203996.65779: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000cb 7491 1727203996.65879: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000cb 7491 1727203996.65882: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": false, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 0fe9d42b-408b-41ad-8245-f1fe5397f441 (not-active)" ] } } 7491 1727203996.65984: no more pending results, returning what we have 7491 1727203996.65988: results queue empty 7491 1727203996.65989: checking for any_errors_fatal 7491 1727203996.65994: done checking for any_errors_fatal 7491 1727203996.65994: checking for max_fail_percentage 7491 1727203996.65998: done 
checking for max_fail_percentage 7491 1727203996.65999: checking to see if all hosts have failed and the running result is not ok 7491 1727203996.66000: done checking to see if all hosts have failed 7491 1727203996.66000: getting the remaining hosts for this loop 7491 1727203996.66002: done getting the remaining hosts for this loop 7491 1727203996.66006: getting the next task for host managed-node3 7491 1727203996.66012: done getting next task for host managed-node3 7491 1727203996.66015: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7491 1727203996.66018: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203996.66028: getting variables 7491 1727203996.66030: in VariableManager get_vars() 7491 1727203996.66088: Calling all_inventory to load vars for managed-node3 7491 1727203996.66091: Calling groups_inventory to load vars for managed-node3 7491 1727203996.66093: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203996.66102: Calling all_plugins_play to load vars for managed-node3 7491 1727203996.66105: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203996.66107: Calling groups_plugins_play to load vars for managed-node3 7491 1727203996.67017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203996.68658: done with get_vars() 7491 1727203996.68694: done getting variables 7491 1727203996.68761: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:53:16 -0400 (0:00:00.047) 0:00:38.611 ***** 7491 1727203996.68801: entering _queue_task() for managed-node3/debug 7491 1727203996.69139: worker is 1 (out of 1 available) 7491 1727203996.69151: exiting _queue_task() for managed-node3/debug 7491 1727203996.69166: done queuing things up, now waiting for results queue to drain 7491 1727203996.69168: waiting for pending results... 
7491 1727203996.69481: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7491 1727203996.69635: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000cc 7491 1727203996.69658: variable 'ansible_search_path' from source: unknown 7491 1727203996.69673: variable 'ansible_search_path' from source: unknown 7491 1727203996.69715: calling self._execute() 7491 1727203996.69826: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.69840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203996.69857: variable 'omit' from source: magic vars 7491 1727203996.70207: variable 'ansible_distribution_major_version' from source: facts 7491 1727203996.70220: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203996.70307: variable 'network_state' from source: role '' defaults 7491 1727203996.70319: Evaluated conditional (network_state != {}): False 7491 1727203996.70323: when evaluation is False, skipping this task 7491 1727203996.70325: _execute() done 7491 1727203996.70328: dumping result to json 7491 1727203996.70330: done dumping result, returning 7491 1727203996.70335: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-0a4a-ad01-0000000000cc] 7491 1727203996.70342: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000cc 7491 1727203996.70437: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000cc 7491 1727203996.70440: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 7491 1727203996.70490: no more pending results, returning what we have 7491 1727203996.70493: results queue empty 7491 1727203996.70494: checking for any_errors_fatal 7491 1727203996.70502: done checking for any_errors_fatal 7491 1727203996.70503: checking for max_fail_percentage 7491 
1727203996.70505: done checking for max_fail_percentage 7491 1727203996.70505: checking to see if all hosts have failed and the running result is not ok 7491 1727203996.70507: done checking to see if all hosts have failed 7491 1727203996.70507: getting the remaining hosts for this loop 7491 1727203996.70509: done getting the remaining hosts for this loop 7491 1727203996.70514: getting the next task for host managed-node3 7491 1727203996.70522: done getting next task for host managed-node3 7491 1727203996.70526: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7491 1727203996.70529: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203996.70551: getting variables 7491 1727203996.70553: in VariableManager get_vars() 7491 1727203996.70599: Calling all_inventory to load vars for managed-node3 7491 1727203996.70601: Calling groups_inventory to load vars for managed-node3 7491 1727203996.70604: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203996.70612: Calling all_plugins_play to load vars for managed-node3 7491 1727203996.70615: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203996.70620: Calling groups_plugins_play to load vars for managed-node3 7491 1727203996.71590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203996.72512: done with get_vars() 7491 1727203996.72532: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:53:16 -0400 (0:00:00.038) 0:00:38.649 ***** 7491 1727203996.72605: entering _queue_task() for managed-node3/ping 7491 1727203996.72841: worker is 1 (out of 1 available) 7491 1727203996.72854: exiting _queue_task() for managed-node3/ping 7491 1727203996.72869: done queuing things up, now waiting for results queue to drain 7491 1727203996.72871: waiting for pending results... 
7491 1727203996.73062: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7491 1727203996.73149: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000cd 7491 1727203996.73168: variable 'ansible_search_path' from source: unknown 7491 1727203996.73172: variable 'ansible_search_path' from source: unknown 7491 1727203996.73203: calling self._execute() 7491 1727203996.73287: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.73291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203996.73301: variable 'omit' from source: magic vars 7491 1727203996.73584: variable 'ansible_distribution_major_version' from source: facts 7491 1727203996.73593: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203996.73603: variable 'omit' from source: magic vars 7491 1727203996.73648: variable 'omit' from source: magic vars 7491 1727203996.73674: variable 'omit' from source: magic vars 7491 1727203996.73712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203996.73741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203996.73759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203996.73773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203996.73784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203996.73811: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203996.73815: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.73818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 
1727203996.73889: Set connection var ansible_timeout to 10 7491 1727203996.73894: Set connection var ansible_pipelining to False 7491 1727203996.73899: Set connection var ansible_shell_type to sh 7491 1727203996.73905: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203996.73913: Set connection var ansible_shell_executable to /bin/sh 7491 1727203996.73924: Set connection var ansible_connection to ssh 7491 1727203996.73941: variable 'ansible_shell_executable' from source: unknown 7491 1727203996.73944: variable 'ansible_connection' from source: unknown 7491 1727203996.73946: variable 'ansible_module_compression' from source: unknown 7491 1727203996.73949: variable 'ansible_shell_type' from source: unknown 7491 1727203996.73951: variable 'ansible_shell_executable' from source: unknown 7491 1727203996.73953: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203996.73955: variable 'ansible_pipelining' from source: unknown 7491 1727203996.73959: variable 'ansible_timeout' from source: unknown 7491 1727203996.73964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203996.74127: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203996.74141: variable 'omit' from source: magic vars 7491 1727203996.74145: starting attempt loop 7491 1727203996.74148: running the handler 7491 1727203996.74159: _low_level_execute_command(): starting 7491 1727203996.74168: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203996.74703: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203996.74720: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.74739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203996.74752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.74763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.74813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203996.74825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203996.74889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203996.76514: stdout chunk (state=3): >>>/root <<< 7491 1727203996.76615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203996.76681: stderr chunk (state=3): >>><<< 7491 1727203996.76684: stdout chunk (state=3): >>><<< 7491 1727203996.76705: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203996.76717: _low_level_execute_command(): starting 7491 1727203996.76729: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203996.7670612-9283-195253742598488 `" && echo ansible-tmp-1727203996.7670612-9283-195253742598488="` echo /root/.ansible/tmp/ansible-tmp-1727203996.7670612-9283-195253742598488 `" ) && sleep 0' 7491 1727203996.77200: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203996.77215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.77234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203996.77248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203996.77274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.77307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203996.77322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203996.77377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203996.79345: stdout chunk (state=3): >>>ansible-tmp-1727203996.7670612-9283-195253742598488=/root/.ansible/tmp/ansible-tmp-1727203996.7670612-9283-195253742598488 <<< 7491 1727203996.79461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203996.79519: stderr chunk (state=3): >>><<< 7491 1727203996.79527: stdout chunk (state=3): >>><<< 7491 1727203996.79544: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203996.7670612-9283-195253742598488=/root/.ansible/tmp/ansible-tmp-1727203996.7670612-9283-195253742598488 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203996.79589: variable 'ansible_module_compression' from source: unknown 7491 1727203996.79631: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7491 1727203996.79661: variable 'ansible_facts' from source: unknown 7491 1727203996.79710: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203996.7670612-9283-195253742598488/AnsiballZ_ping.py 7491 1727203996.79823: Sending initial data 7491 1727203996.79833: Sent initial data (151 bytes) 7491 1727203996.80525: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203996.80539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.80556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.80569: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.80624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203996.80641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203996.80691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203996.82514: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203996.82551: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203996.82594: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmph6d9auua /root/.ansible/tmp/ansible-tmp-1727203996.7670612-9283-195253742598488/AnsiballZ_ping.py <<< 7491 1727203996.82631: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203996.83419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203996.83538: stderr chunk (state=3): >>><<< 7491 1727203996.83541: stdout chunk (state=3): >>><<< 7491 1727203996.83560: done transferring module to remote 7491 1727203996.83571: _low_level_execute_command(): starting 7491 1727203996.83576: _low_level_execute_command(): executing: /bin/sh 
-c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203996.7670612-9283-195253742598488/ /root/.ansible/tmp/ansible-tmp-1727203996.7670612-9283-195253742598488/AnsiballZ_ping.py && sleep 0' 7491 1727203996.84049: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203996.84066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.84093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203996.84107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203996.84120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.84159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203996.84179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203996.84228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203996.86015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203996.86071: stderr chunk (state=3): >>><<< 7491 1727203996.86074: stdout chunk (state=3): >>><<< 7491 1727203996.86091: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203996.86101: _low_level_execute_command(): starting 7491 1727203996.86104: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203996.7670612-9283-195253742598488/AnsiballZ_ping.py && sleep 0' 7491 1727203996.86565: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203996.86579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203996.86607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.86619: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203996.86673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203996.86685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203996.86745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203996.99869: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7491 1727203997.00930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203997.00996: stderr chunk (state=3): >>><<< 7491 1727203997.00999: stdout chunk (state=3): >>><<< 7491 1727203997.01014: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
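The exchange above is one complete `ping` module round-trip: Ansible copies an AnsiballZ-wrapped payload to the remote temp directory, executes it with the remote interpreter (`/usr/bin/python3.9` here), and parses the `{"ping": "pong"}` JSON on stdout into the task result while the SSH `debug1`/`debug2` lines go to stderr. A task producing this exchange would look roughly like the following sketch (an assumed shape, not the role's actual source):

```yaml
- name: Re-test connectivity
  ansible.builtin.ping:
```

Because the connection reuses an existing ControlMaster ("auto-mux: Trying existing master"), no new SSH authentication happens for this command; only a new mux session (id 2) is opened.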
7491 1727203997.01040: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203996.7670612-9283-195253742598488/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203997.01047: _low_level_execute_command(): starting 7491 1727203997.01059: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203996.7670612-9283-195253742598488/ > /dev/null 2>&1 && sleep 0' 7491 1727203997.01532: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203997.01547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203997.01568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203997.01584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.01639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203997.01650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203997.01700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203997.03522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203997.03583: stderr chunk (state=3): >>><<< 7491 1727203997.03586: stdout chunk (state=3): >>><<< 7491 1727203997.03603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203997.03609: handler run complete 7491 1727203997.03624: attempt loop complete, returning result 7491 1727203997.03627: _execute() 
done 7491 1727203997.03629: dumping result to json 7491 1727203997.03637: done dumping result, returning 7491 1727203997.03647: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-0a4a-ad01-0000000000cd] 7491 1727203997.03653: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000cd 7491 1727203997.03745: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000cd 7491 1727203997.03747: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 7491 1727203997.03830: no more pending results, returning what we have 7491 1727203997.03842: results queue empty 7491 1727203997.03843: checking for any_errors_fatal 7491 1727203997.03850: done checking for any_errors_fatal 7491 1727203997.03851: checking for max_fail_percentage 7491 1727203997.03852: done checking for max_fail_percentage 7491 1727203997.03853: checking to see if all hosts have failed and the running result is not ok 7491 1727203997.03855: done checking to see if all hosts have failed 7491 1727203997.03856: getting the remaining hosts for this loop 7491 1727203997.03858: done getting the remaining hosts for this loop 7491 1727203997.03862: getting the next task for host managed-node3 7491 1727203997.03871: done getting next task for host managed-node3 7491 1727203997.03873: ^ task is: TASK: meta (role_complete) 7491 1727203997.03876: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7491 1727203997.03887: getting variables 7491 1727203997.03889: in VariableManager get_vars() 7491 1727203997.03937: Calling all_inventory to load vars for managed-node3 7491 1727203997.03939: Calling groups_inventory to load vars for managed-node3 7491 1727203997.03941: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203997.03951: Calling all_plugins_play to load vars for managed-node3 7491 1727203997.03953: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203997.03956: Calling groups_plugins_play to load vars for managed-node3 7491 1727203997.04800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203997.05844: done with get_vars() 7491 1727203997.05861: done getting variables 7491 1727203997.05930: done queuing things up, now waiting for results queue to drain 7491 1727203997.05932: results queue empty 7491 1727203997.05932: checking for any_errors_fatal 7491 1727203997.05934: done checking for any_errors_fatal 7491 1727203997.05934: checking for max_fail_percentage 7491 1727203997.05935: done checking for max_fail_percentage 7491 1727203997.05936: checking to see if all hosts have failed and the running result is not ok 7491 1727203997.05936: done checking to see if all hosts have failed 7491 1727203997.05937: getting the remaining hosts for this loop 7491 1727203997.05937: done getting the remaining hosts for this loop 7491 1727203997.05939: getting the next task for host managed-node3 7491 1727203997.05943: done getting next task for host managed-node3 7491 1727203997.05944: ^ task is: TASK: Include the task 'assert_device_present.yml' 7491 1727203997.05945: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 7491 1727203997.05947: getting variables 7491 1727203997.05948: in VariableManager get_vars() 7491 1727203997.05961: Calling all_inventory to load vars for managed-node3 7491 1727203997.05963: Calling groups_inventory to load vars for managed-node3 7491 1727203997.05966: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203997.05970: Calling all_plugins_play to load vars for managed-node3 7491 1727203997.05971: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203997.05973: Calling groups_plugins_play to load vars for managed-node3 7491 1727203997.06639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203997.07557: done with get_vars() 7491 1727203997.07579: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:108 Tuesday 24 September 2024 14:53:17 -0400 (0:00:00.350) 0:00:39.000 ***** 7491 1727203997.07639: entering _queue_task() for managed-node3/include_tasks 7491 1727203997.07884: worker is 1 (out of 1 available) 7491 1727203997.07898: exiting _queue_task() for managed-node3/include_tasks 7491 1727203997.07912: done queuing things up, now waiting for results queue to drain 7491 1727203997.07913: waiting for pending results... 
7491 1727203997.08104: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' 7491 1727203997.08172: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000fd 7491 1727203997.08187: variable 'ansible_search_path' from source: unknown 7491 1727203997.08216: calling self._execute() 7491 1727203997.08294: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203997.08299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203997.08310: variable 'omit' from source: magic vars 7491 1727203997.08594: variable 'ansible_distribution_major_version' from source: facts 7491 1727203997.08604: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203997.08614: _execute() done 7491 1727203997.08617: dumping result to json 7491 1727203997.08622: done dumping result, returning 7491 1727203997.08625: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' [0affcd87-79f5-0a4a-ad01-0000000000fd] 7491 1727203997.08629: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000fd 7491 1727203997.08720: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000fd 7491 1727203997.08723: WORKER PROCESS EXITING 7491 1727203997.08755: no more pending results, returning what we have 7491 1727203997.08760: in VariableManager get_vars() 7491 1727203997.08823: Calling all_inventory to load vars for managed-node3 7491 1727203997.08826: Calling groups_inventory to load vars for managed-node3 7491 1727203997.08829: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203997.08844: Calling all_plugins_play to load vars for managed-node3 7491 1727203997.08851: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203997.08854: Calling groups_plugins_play to load vars for managed-node3 7491 1727203997.09831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7491 1727203997.10748: done with get_vars() 7491 1727203997.10767: variable 'ansible_search_path' from source: unknown 7491 1727203997.10779: we have included files to process 7491 1727203997.10780: generating all_blocks data 7491 1727203997.10782: done generating all_blocks data 7491 1727203997.10788: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7491 1727203997.10789: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7491 1727203997.10791: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7491 1727203997.10871: in VariableManager get_vars() 7491 1727203997.10890: done with get_vars() 7491 1727203997.10977: done processing included file 7491 1727203997.10979: iterating over new_blocks loaded from include file 7491 1727203997.10980: in VariableManager get_vars() 7491 1727203997.10995: done with get_vars() 7491 1727203997.10996: filtering new block on tags 7491 1727203997.11010: done filtering new block on tags 7491 1727203997.11012: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 7491 1727203997.11017: extending task lists for all hosts with included blocks 7491 1727203997.14890: done extending task lists 7491 1727203997.14892: done processing included files 7491 1727203997.14892: results queue empty 7491 1727203997.14893: checking for any_errors_fatal 7491 1727203997.14894: done checking for any_errors_fatal 7491 1727203997.14895: checking for max_fail_percentage 7491 1727203997.14896: done checking for max_fail_percentage 7491 1727203997.14896: checking to see if all hosts have failed and the running 
result is not ok 7491 1727203997.14897: done checking to see if all hosts have failed 7491 1727203997.14897: getting the remaining hosts for this loop 7491 1727203997.14898: done getting the remaining hosts for this loop 7491 1727203997.14900: getting the next task for host managed-node3 7491 1727203997.14903: done getting next task for host managed-node3 7491 1727203997.14905: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7491 1727203997.14906: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203997.14908: getting variables 7491 1727203997.14909: in VariableManager get_vars() 7491 1727203997.14929: Calling all_inventory to load vars for managed-node3 7491 1727203997.14931: Calling groups_inventory to load vars for managed-node3 7491 1727203997.14932: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203997.14939: Calling all_plugins_play to load vars for managed-node3 7491 1727203997.14941: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203997.14942: Calling groups_plugins_play to load vars for managed-node3 7491 1727203997.15659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203997.22435: done with get_vars() 7491 1727203997.22466: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:53:17 -0400 (0:00:00.149) 0:00:39.149 ***** 7491 1727203997.22546: entering _queue_task() for managed-node3/include_tasks 7491 1727203997.22884: worker is 1 (out of 1 available) 7491 1727203997.22896: exiting _queue_task() for managed-node3/include_tasks 7491 1727203997.22909: done queuing things up, now waiting for results queue to drain 7491 1727203997.22910: waiting for pending results... 
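The queueing trace above shows how an `include_tasks` task is processed: the `when` conditional (`ansible_distribution_major_version != '6'`) is evaluated first, then the included file is loaded, its blocks are generated and filtered on tags, and the host's task list is extended in place. A sketch of what the including task at `assert_device_present.yml:3` presumably looks like (assumed shape based on the logged task name and conditional):

```yaml
- name: Include the task 'get_interface_stat.yml'
  ansible.builtin.include_tasks: tasks/get_interface_stat.yml
  when: ansible_distribution_major_version != '6'
```

Note that `include_tasks` itself runs no remote commands; it only splices new tasks into the play, which is why this step completes without any SSH traffic.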
7491 1727203997.23218: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 7491 1727203997.23348: in run() - task 0affcd87-79f5-0a4a-ad01-00000000143a 7491 1727203997.23375: variable 'ansible_search_path' from source: unknown 7491 1727203997.23387: variable 'ansible_search_path' from source: unknown 7491 1727203997.23430: calling self._execute() 7491 1727203997.23541: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203997.23556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203997.23577: variable 'omit' from source: magic vars 7491 1727203997.23977: variable 'ansible_distribution_major_version' from source: facts 7491 1727203997.23993: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203997.24007: _execute() done 7491 1727203997.24014: dumping result to json 7491 1727203997.24021: done dumping result, returning 7491 1727203997.24030: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-0a4a-ad01-00000000143a] 7491 1727203997.24041: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000143a 7491 1727203997.24167: no more pending results, returning what we have 7491 1727203997.24173: in VariableManager get_vars() 7491 1727203997.24234: Calling all_inventory to load vars for managed-node3 7491 1727203997.24237: Calling groups_inventory to load vars for managed-node3 7491 1727203997.24240: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203997.24254: Calling all_plugins_play to load vars for managed-node3 7491 1727203997.24257: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203997.24260: Calling groups_plugins_play to load vars for managed-node3 7491 1727203997.25286: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000143a 7491 1727203997.25290: WORKER PROCESS EXITING 7491 1727203997.25957: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203997.27592: done with get_vars() 7491 1727203997.27617: variable 'ansible_search_path' from source: unknown 7491 1727203997.27618: variable 'ansible_search_path' from source: unknown 7491 1727203997.27660: we have included files to process 7491 1727203997.27661: generating all_blocks data 7491 1727203997.27663: done generating all_blocks data 7491 1727203997.27666: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7491 1727203997.27667: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7491 1727203997.27669: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7491 1727203997.27857: done processing included file 7491 1727203997.27859: iterating over new_blocks loaded from include file 7491 1727203997.27861: in VariableManager get_vars() 7491 1727203997.27889: done with get_vars() 7491 1727203997.27891: filtering new block on tags 7491 1727203997.27907: done filtering new block on tags 7491 1727203997.27909: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 7491 1727203997.27915: extending task lists for all hosts with included blocks 7491 1727203997.28022: done extending task lists 7491 1727203997.28023: done processing included files 7491 1727203997.28024: results queue empty 7491 1727203997.28025: checking for any_errors_fatal 7491 1727203997.28028: done checking for any_errors_fatal 7491 1727203997.28029: checking for max_fail_percentage 7491 1727203997.28030: done checking for max_fail_percentage 7491 1727203997.28031: 
checking to see if all hosts have failed and the running result is not ok 7491 1727203997.28032: done checking to see if all hosts have failed 7491 1727203997.28033: getting the remaining hosts for this loop 7491 1727203997.28034: done getting the remaining hosts for this loop 7491 1727203997.28036: getting the next task for host managed-node3 7491 1727203997.28040: done getting next task for host managed-node3 7491 1727203997.28042: ^ task is: TASK: Get stat for interface {{ interface }} 7491 1727203997.28045: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203997.28047: getting variables 7491 1727203997.28048: in VariableManager get_vars() 7491 1727203997.28068: Calling all_inventory to load vars for managed-node3 7491 1727203997.28071: Calling groups_inventory to load vars for managed-node3 7491 1727203997.28073: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203997.28079: Calling all_plugins_play to load vars for managed-node3 7491 1727203997.28081: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203997.28084: Calling groups_plugins_play to load vars for managed-node3 7491 1727203997.29354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203997.30982: done with get_vars() 7491 1727203997.31005: done getting variables 7491 1727203997.31174: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:53:17 -0400 (0:00:00.086) 0:00:39.235 ***** 7491 1727203997.31207: entering _queue_task() for managed-node3/stat 7491 1727203997.31525: worker is 1 (out of 1 available) 7491 1727203997.31540: exiting _queue_task() for managed-node3/stat 7491 1727203997.31553: done queuing things up, now waiting for results queue to drain 7491 1727203997.31554: waiting for pending results... 
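The templated task name `Get stat for interface {{ interface }}` renders to `Get stat for interface veth0` once `interface` is resolved from play vars (visible in the "variable 'interface' from source: play vars" line). A hedged sketch of such a stat task — the `/sys/class/net` path is an assumption about how the test checks device presence, not confirmed by this log:

```yaml
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}  # assumed path; any per-interface kernel path would work
  register: interface_stat
```

Registering the result lets a later `assert` task check `interface_stat.stat.exists`, which matches the enclosing `assert_device_present.yml` include.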
7491 1727203997.31854: running TaskExecutor() for managed-node3/TASK: Get stat for interface veth0 7491 1727203997.31987: in run() - task 0affcd87-79f5-0a4a-ad01-0000000016ba 7491 1727203997.32013: variable 'ansible_search_path' from source: unknown 7491 1727203997.32021: variable 'ansible_search_path' from source: unknown 7491 1727203997.32059: calling self._execute() 7491 1727203997.32169: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203997.32181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203997.32196: variable 'omit' from source: magic vars 7491 1727203997.32588: variable 'ansible_distribution_major_version' from source: facts 7491 1727203997.32604: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203997.32614: variable 'omit' from source: magic vars 7491 1727203997.32671: variable 'omit' from source: magic vars 7491 1727203997.32774: variable 'interface' from source: play vars 7491 1727203997.32795: variable 'omit' from source: magic vars 7491 1727203997.32847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203997.32981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203997.33007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203997.33030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203997.33046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203997.33086: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203997.33095: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203997.33103: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed-node3' 7491 1727203997.33206: Set connection var ansible_timeout to 10 7491 1727203997.33218: Set connection var ansible_pipelining to False 7491 1727203997.33228: Set connection var ansible_shell_type to sh 7491 1727203997.33237: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203997.33248: Set connection var ansible_shell_executable to /bin/sh 7491 1727203997.33256: Set connection var ansible_connection to ssh 7491 1727203997.33286: variable 'ansible_shell_executable' from source: unknown 7491 1727203997.33293: variable 'ansible_connection' from source: unknown 7491 1727203997.33300: variable 'ansible_module_compression' from source: unknown 7491 1727203997.33307: variable 'ansible_shell_type' from source: unknown 7491 1727203997.33313: variable 'ansible_shell_executable' from source: unknown 7491 1727203997.33319: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203997.33327: variable 'ansible_pipelining' from source: unknown 7491 1727203997.33333: variable 'ansible_timeout' from source: unknown 7491 1727203997.33340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203997.33546: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727203997.33562: variable 'omit' from source: magic vars 7491 1727203997.33576: starting attempt loop 7491 1727203997.33582: running the handler 7491 1727203997.33604: _low_level_execute_command(): starting 7491 1727203997.33615: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203997.34351: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203997.34373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 
1727203997.34389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203997.34408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203997.34451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203997.34466: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203997.34482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.34500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203997.34513: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203997.34524: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203997.34536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203997.34548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203997.34563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203997.34581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203997.34592: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203997.34605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.34685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203997.34710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203997.34729: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203997.34813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203997.36490: stdout chunk 
(state=3): >>>/root <<< 7491 1727203997.36692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203997.36696: stdout chunk (state=3): >>><<< 7491 1727203997.36699: stderr chunk (state=3): >>><<< 7491 1727203997.36835: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203997.36840: _low_level_execute_command(): starting 7491 1727203997.36843: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203997.3672345-9295-67126067031458 `" && echo ansible-tmp-1727203997.3672345-9295-67126067031458="` echo /root/.ansible/tmp/ansible-tmp-1727203997.3672345-9295-67126067031458 `" ) && sleep 0' 7491 1727203997.38086: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203997.38097: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203997.38106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203997.38120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203997.38161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203997.38174: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203997.38184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.38198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203997.38205: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203997.38213: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203997.38219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203997.38231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203997.38243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203997.38250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203997.38256: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203997.38267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.38341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203997.38358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203997.38373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203997.38448: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7491 1727203997.40401: stdout chunk (state=3): >>>ansible-tmp-1727203997.3672345-9295-67126067031458=/root/.ansible/tmp/ansible-tmp-1727203997.3672345-9295-67126067031458 <<< 7491 1727203997.40586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203997.40602: stderr chunk (state=3): >>><<< 7491 1727203997.40606: stdout chunk (state=3): >>><<< 7491 1727203997.40632: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203997.3672345-9295-67126067031458=/root/.ansible/tmp/ansible-tmp-1727203997.3672345-9295-67126067031458 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203997.40680: variable 'ansible_module_compression' from source: unknown 7491 1727203997.40746: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 
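At this point in the log Ansible has created the remote temp directory and reused a cached AnsiballZ build of the `stat` module; the records that follow transfer it over SFTP and execute it against `/sys/class/net/veth0`. A minimal sketch of a task that would produce the `module_args` dumped further down in this log (the exact test-task file contents and the `register` wiring for `interface_stat` are assumptions; the log later reports that variable coming from `set_fact`):

```yaml
# Hypothetical reconstruction from the logged module_args; not the
# verbatim contents of the fedora.linux_system_roles test task file.
- name: Get stat for interface veth0
  ansible.builtin.stat:
    path: /sys/class/net/veth0   # sysfs entry; a symlink for virtual devices
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat       # assumed; the log shows the var from set_fact
```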
7491 1727203997.40784: variable 'ansible_facts' from source: unknown 7491 1727203997.40873: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203997.3672345-9295-67126067031458/AnsiballZ_stat.py 7491 1727203997.41728: Sending initial data 7491 1727203997.41732: Sent initial data (150 bytes) 7491 1727203997.44235: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203997.44239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203997.44331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.44335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203997.44384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203997.44390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.44478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203997.44605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203997.44612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203997.44714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203997.46544: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203997.46580: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203997.46622: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpjfz5apjt /root/.ansible/tmp/ansible-tmp-1727203997.3672345-9295-67126067031458/AnsiballZ_stat.py <<< 7491 1727203997.46657: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203997.48026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203997.48204: stderr chunk (state=3): >>><<< 7491 1727203997.48207: stdout chunk (state=3): >>><<< 7491 1727203997.48209: done transferring module to remote 7491 1727203997.48211: _low_level_execute_command(): starting 7491 1727203997.48213: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203997.3672345-9295-67126067031458/ /root/.ansible/tmp/ansible-tmp-1727203997.3672345-9295-67126067031458/AnsiballZ_stat.py && sleep 0' 7491 1727203997.49092: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203997.49096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 
1727203997.49132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.49135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203997.49138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.49818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203997.49822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203997.49842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203997.49905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203997.51718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203997.51796: stderr chunk (state=3): >>><<< 7491 1727203997.51800: stdout chunk (state=3): >>><<< 7491 1727203997.51901: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203997.51904: _low_level_execute_command(): starting 7491 1727203997.51907: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203997.3672345-9295-67126067031458/AnsiballZ_stat.py && sleep 0' 7491 1727203997.53159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203997.53167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203997.53198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203997.53203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203997.53206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 
10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.53266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203997.53886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203997.53889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203997.53954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203997.67223: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23913, "dev": 21, "nlink": 1, "atime": 1727203989.4813359, "mtime": 1727203989.4813359, "ctime": 1727203989.4813359, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7491 1727203997.68254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203997.68346: stderr chunk (state=3): >>><<< 7491 1727203997.68350: stdout chunk (state=3): >>><<< 7491 1727203997.68523: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23913, "dev": 21, "nlink": 1, "atime": 1727203989.4813359, "mtime": 1727203989.4813359, "ctime": 1727203989.4813359, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203997.68527: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203997.3672345-9295-67126067031458/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203997.68536: _low_level_execute_command(): starting 7491 1727203997.68539: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203997.3672345-9295-67126067031458/ > /dev/null 2>&1 && sleep 0' 7491 1727203997.70978: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203997.71288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203997.71298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203997.71313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203997.71356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203997.71363: stderr chunk (state=3): 
>>>debug2: match not found <<< 7491 1727203997.71377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.71391: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203997.71399: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203997.71406: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203997.71414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203997.71425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203997.71437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203997.71480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203997.71487: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203997.71496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203997.71578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203997.71790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203997.71802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203997.71875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203997.73748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203997.73753: stdout chunk (state=3): >>><<< 7491 1727203997.73762: stderr chunk (state=3): >>><<< 7491 1727203997.73782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203997.73788: handler run complete 7491 1727203997.73851: attempt loop complete, returning result 7491 1727203997.73855: _execute() done 7491 1727203997.73857: dumping result to json 7491 1727203997.73860: done dumping result, returning 7491 1727203997.73880: done running TaskExecutor() for managed-node3/TASK: Get stat for interface veth0 [0affcd87-79f5-0a4a-ad01-0000000016ba] 7491 1727203997.73886: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000016ba 7491 1727203997.74013: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000016ba 7491 1727203997.74015: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727203989.4813359, "block_size": 4096, "blocks": 0, "ctime": 1727203989.4813359, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 23913, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, 
"lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1727203989.4813359, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7491 1727203997.74103: no more pending results, returning what we have 7491 1727203997.74107: results queue empty 7491 1727203997.74108: checking for any_errors_fatal 7491 1727203997.74110: done checking for any_errors_fatal 7491 1727203997.74111: checking for max_fail_percentage 7491 1727203997.74112: done checking for max_fail_percentage 7491 1727203997.74113: checking to see if all hosts have failed and the running result is not ok 7491 1727203997.74115: done checking to see if all hosts have failed 7491 1727203997.74115: getting the remaining hosts for this loop 7491 1727203997.74117: done getting the remaining hosts for this loop 7491 1727203997.74121: getting the next task for host managed-node3 7491 1727203997.74130: done getting next task for host managed-node3 7491 1727203997.74132: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7491 1727203997.74135: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203997.74138: getting variables 7491 1727203997.74140: in VariableManager get_vars() 7491 1727203997.74191: Calling all_inventory to load vars for managed-node3 7491 1727203997.74194: Calling groups_inventory to load vars for managed-node3 7491 1727203997.74196: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203997.74206: Calling all_plugins_play to load vars for managed-node3 7491 1727203997.74209: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203997.74211: Calling groups_plugins_play to load vars for managed-node3 7491 1727203997.77250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203997.81543: done with get_vars() 7491 1727203997.81579: done getting variables 7491 1727203997.81643: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203997.82478: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:53:17 -0400 (0:00:00.513) 0:00:39.749 ***** 7491 1727203997.82513: entering _queue_task() for managed-node3/assert 7491 1727203997.83256: worker is 1 (out of 1 available) 7491 1727203997.83673: exiting _queue_task() for managed-node3/assert 7491 1727203997.83687: done queuing things up, now waiting for results queue to drain 7491 1727203997.83688: waiting for pending results... 
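The task banner above comes from `assert_device_present.yml:5`, and the records that follow show the `assert` action evaluating `interface_stat.stat.exists`. A sketch of what that task plausibly looks like (only the task name and the asserted condition are taken from the log; anything else in the real file, such as a `msg` or `quiet` option, is not shown here):

```yaml
# Sketch; only the task name and the asserted condition appear in the log.
- name: "Assert that the interface is present - '{{ interface }}'"
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
```

When the condition holds, the action returns `changed: false` with the message `All assertions passed`, exactly as recorded in the task result below.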
7491 1727203997.84210: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'veth0' 7491 1727203997.84581: in run() - task 0affcd87-79f5-0a4a-ad01-00000000143b 7491 1727203997.84603: variable 'ansible_search_path' from source: unknown 7491 1727203997.84610: variable 'ansible_search_path' from source: unknown 7491 1727203997.84650: calling self._execute() 7491 1727203997.84765: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203997.84900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203997.84916: variable 'omit' from source: magic vars 7491 1727203997.85690: variable 'ansible_distribution_major_version' from source: facts 7491 1727203997.85775: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203997.85787: variable 'omit' from source: magic vars 7491 1727203997.85827: variable 'omit' from source: magic vars 7491 1727203997.85973: variable 'interface' from source: play vars 7491 1727203997.86110: variable 'omit' from source: magic vars 7491 1727203997.86243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203997.86288: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203997.86333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203997.86437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203997.86452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203997.86488: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203997.86530: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203997.86538: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203997.86754: Set connection var ansible_timeout to 10 7491 1727203997.86767: Set connection var ansible_pipelining to False 7491 1727203997.86777: Set connection var ansible_shell_type to sh 7491 1727203997.86786: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203997.86861: Set connection var ansible_shell_executable to /bin/sh 7491 1727203997.86872: Set connection var ansible_connection to ssh 7491 1727203997.86902: variable 'ansible_shell_executable' from source: unknown 7491 1727203997.86910: variable 'ansible_connection' from source: unknown 7491 1727203997.86917: variable 'ansible_module_compression' from source: unknown 7491 1727203997.86924: variable 'ansible_shell_type' from source: unknown 7491 1727203997.86966: variable 'ansible_shell_executable' from source: unknown 7491 1727203997.86974: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203997.86982: variable 'ansible_pipelining' from source: unknown 7491 1727203997.86989: variable 'ansible_timeout' from source: unknown 7491 1727203997.86996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203997.87243: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203997.87301: variable 'omit' from source: magic vars 7491 1727203997.87310: starting attempt loop 7491 1727203997.87315: running the handler 7491 1727203997.87585: variable 'interface_stat' from source: set_fact 7491 1727203997.87750: Evaluated conditional (interface_stat.stat.exists): True 7491 1727203997.87761: handler run complete 7491 1727203997.87783: attempt loop complete, returning result 7491 1727203997.87790: _execute() done 
7491 1727203997.87797: dumping result to json 7491 1727203997.87804: done dumping result, returning 7491 1727203997.87814: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'veth0' [0affcd87-79f5-0a4a-ad01-00000000143b] 7491 1727203997.87825: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000143b ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727203997.87996: no more pending results, returning what we have 7491 1727203997.88001: results queue empty 7491 1727203997.88002: checking for any_errors_fatal 7491 1727203997.88015: done checking for any_errors_fatal 7491 1727203997.88016: checking for max_fail_percentage 7491 1727203997.88018: done checking for max_fail_percentage 7491 1727203997.88019: checking to see if all hosts have failed and the running result is not ok 7491 1727203997.88020: done checking to see if all hosts have failed 7491 1727203997.88021: getting the remaining hosts for this loop 7491 1727203997.88023: done getting the remaining hosts for this loop 7491 1727203997.88028: getting the next task for host managed-node3 7491 1727203997.88037: done getting next task for host managed-node3 7491 1727203997.88040: ^ task is: TASK: Include the task 'assert_profile_present.yml' 7491 1727203997.88043: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203997.88048: getting variables 7491 1727203997.88050: in VariableManager get_vars() 7491 1727203997.88109: Calling all_inventory to load vars for managed-node3 7491 1727203997.88113: Calling groups_inventory to load vars for managed-node3 7491 1727203997.88116: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203997.88130: Calling all_plugins_play to load vars for managed-node3 7491 1727203997.88133: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203997.88136: Calling groups_plugins_play to load vars for managed-node3 7491 1727203997.89586: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000143b 7491 1727203997.89590: WORKER PROCESS EXITING 7491 1727203997.91577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203997.94668: done with get_vars() 7491 1727203997.94703: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:110 Tuesday 24 September 2024 14:53:17 -0400 (0:00:00.122) 0:00:39.871 ***** 7491 1727203997.94803: entering _queue_task() for managed-node3/include_tasks 7491 1727203997.95629: worker is 1 (out of 1 available) 7491 1727203997.95643: exiting _queue_task() for managed-node3/include_tasks 7491 1727203997.95658: done queuing things up, now waiting for results queue to drain 7491 1727203997.95659: waiting for pending results... 
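The final task in this excerpt is an include recorded at `tests_auto_gateway.yml:110`. A sketch of that include (the relative path is an assumption; the log confirms the included file name and that `ansible_distribution_major_version != '6'` was evaluated as True for this task, though that condition may be inherited from the play rather than set on the task itself):

```yaml
# Sketch; file name and evaluated conditional are taken from the log.
- name: Include the task 'assert_profile_present.yml'
  ansible.builtin.include_tasks: tasks/assert_profile_present.yml
  when: ansible_distribution_major_version != '6'   # placement assumed
```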
7491 1727203997.96785: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml'
7491 1727203997.96908: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000fe
7491 1727203997.96932: variable 'ansible_search_path' from source: unknown
7491 1727203997.96986: calling self._execute()
7491 1727203997.97106: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203997.97118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203997.97134: variable 'omit' from source: magic vars
7491 1727203997.97540: variable 'ansible_distribution_major_version' from source: facts
7491 1727203997.97583: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203997.97600: _execute() done
7491 1727203997.97614: dumping result to json
7491 1727203997.97622: done dumping result, returning
7491 1727203997.97633: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' [0affcd87-79f5-0a4a-ad01-0000000000fe]
7491 1727203997.97645: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000fe
7491 1727203997.97790: no more pending results, returning what we have
7491 1727203997.97797: in VariableManager get_vars()
7491 1727203997.97869: Calling all_inventory to load vars for managed-node3
7491 1727203997.97872: Calling groups_inventory to load vars for managed-node3
7491 1727203997.97875: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203997.97891: Calling all_plugins_play to load vars for managed-node3
7491 1727203997.97894: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203997.97898: Calling groups_plugins_play to load vars for managed-node3
7491 1727203997.99077: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000fe
7491 1727203997.99082: WORKER PROCESS EXITING
7491 1727203997.99762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203998.03442: done with get_vars()
7491 1727203998.03466: variable 'ansible_search_path' from source: unknown
7491 1727203998.03482: we have included files to process
7491 1727203998.03484: generating all_blocks data
7491 1727203998.03486: done generating all_blocks data
7491 1727203998.03489: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
7491 1727203998.03490: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
7491 1727203998.03611: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
7491 1727203998.03878: in VariableManager get_vars()
7491 1727203998.03967: done with get_vars()
7491 1727203998.04724: done processing included file
7491 1727203998.04726: iterating over new_blocks loaded from include file
7491 1727203998.04728: in VariableManager get_vars()
7491 1727203998.04872: done with get_vars()
7491 1727203998.04874: filtering new block on tags
7491 1727203998.04898: done filtering new block on tags
7491 1727203998.04901: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3
7491 1727203998.04907: extending task lists for all hosts with included blocks
7491 1727203998.12081: done extending task lists
7491 1727203998.12083: done processing included files
7491 1727203998.12083: results queue empty
7491 1727203998.12084: checking for any_errors_fatal
7491 1727203998.12086: done checking for any_errors_fatal
7491 1727203998.12086: checking for max_fail_percentage
7491 1727203998.12087: done checking for max_fail_percentage
7491 1727203998.12088: checking to see if all hosts have failed and the running result is not ok
7491 1727203998.12088: done checking to see if all hosts have failed
7491 1727203998.12089: getting the remaining hosts for this loop
7491 1727203998.12090: done getting the remaining hosts for this loop
7491 1727203998.12092: getting the next task for host managed-node3
7491 1727203998.12094: done getting next task for host managed-node3
7491 1727203998.12096: ^ task is: TASK: Include the task 'get_profile_stat.yml'
7491 1727203998.12097: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203998.12099: getting variables
7491 1727203998.12100: in VariableManager get_vars()
7491 1727203998.12118: Calling all_inventory to load vars for managed-node3
7491 1727203998.12120: Calling groups_inventory to load vars for managed-node3
7491 1727203998.12122: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203998.12127: Calling all_plugins_play to load vars for managed-node3
7491 1727203998.12128: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203998.12130: Calling groups_plugins_play to load vars for managed-node3
7491 1727203998.12949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203998.15053: done with get_vars()
7491 1727203998.15092: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Tuesday 24 September 2024 14:53:18 -0400 (0:00:00.203) 0:00:40.075 *****
7491 1727203998.15195: entering _queue_task() for managed-node3/include_tasks
7491 1727203998.15625: worker is 1 (out of 1 available)
7491 1727203998.15639: exiting _queue_task() for managed-node3/include_tasks
7491 1727203998.15652: done queuing things up, now waiting for results queue to drain
7491 1727203998.15653: waiting for pending results...
7491 1727203998.15842: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml'
7491 1727203998.15919: in run() - task 0affcd87-79f5-0a4a-ad01-0000000016d2
7491 1727203998.15934: variable 'ansible_search_path' from source: unknown
7491 1727203998.15938: variable 'ansible_search_path' from source: unknown
7491 1727203998.15969: calling self._execute()
7491 1727203998.16050: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203998.16054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203998.16062: variable 'omit' from source: magic vars
7491 1727203998.16355: variable 'ansible_distribution_major_version' from source: facts
7491 1727203998.16367: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203998.16373: _execute() done
7491 1727203998.16377: dumping result to json
7491 1727203998.16380: done dumping result, returning
7491 1727203998.16385: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-0a4a-ad01-0000000016d2]
7491 1727203998.16392: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000016d2
7491 1727203998.16482: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000016d2
7491 1727203998.16485: WORKER PROCESS EXITING
7491 1727203998.16514: no more pending results, returning what we have
7491 1727203998.16518: in VariableManager get_vars()
7491 1727203998.16581: Calling all_inventory to load vars for managed-node3
7491 1727203998.16584: Calling groups_inventory to load vars for managed-node3
7491 1727203998.16586: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203998.16605: Calling all_plugins_play to load vars for managed-node3
7491 1727203998.16608: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203998.16612: Calling groups_plugins_play to load vars for managed-node3
7491 1727203998.18882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203998.20639: done with get_vars()
7491 1727203998.20660: variable 'ansible_search_path' from source: unknown
7491 1727203998.20662: variable 'ansible_search_path' from source: unknown
7491 1727203998.20691: we have included files to process
7491 1727203998.20692: generating all_blocks data
7491 1727203998.20694: done generating all_blocks data
7491 1727203998.20695: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
7491 1727203998.20695: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
7491 1727203998.20697: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
7491 1727203998.21524: done processing included file
7491 1727203998.21526: iterating over new_blocks loaded from include file
7491 1727203998.21528: in VariableManager get_vars()
7491 1727203998.21554: done with get_vars()
7491 1727203998.21556: filtering new block on tags
7491 1727203998.21582: done filtering new block on tags
7491 1727203998.21585: in VariableManager get_vars()
7491 1727203998.21618: done with get_vars()
7491 1727203998.21620: filtering new block on tags
7491 1727203998.21641: done filtering new block on tags
7491 1727203998.21643: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3
7491 1727203998.21649: extending task lists for all hosts with included blocks
7491 1727203998.21828: done extending task lists
7491 1727203998.21830: done processing included files
7491 1727203998.21831: results queue empty
7491 1727203998.21831: checking for any_errors_fatal
7491 1727203998.21835: done checking for any_errors_fatal
7491 1727203998.21836: checking for max_fail_percentage
7491 1727203998.21837: done checking for max_fail_percentage
7491 1727203998.21838: checking to see if all hosts have failed and the running result is not ok
7491 1727203998.21839: done checking to see if all hosts have failed
7491 1727203998.21840: getting the remaining hosts for this loop
7491 1727203998.21841: done getting the remaining hosts for this loop
7491 1727203998.21843: getting the next task for host managed-node3
7491 1727203998.21847: done getting next task for host managed-node3
7491 1727203998.21849: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
7491 1727203998.21852: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203998.21855: getting variables
7491 1727203998.21856: in VariableManager get_vars()
7491 1727203998.21876: Calling all_inventory to load vars for managed-node3
7491 1727203998.21879: Calling groups_inventory to load vars for managed-node3
7491 1727203998.21881: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203998.21887: Calling all_plugins_play to load vars for managed-node3
7491 1727203998.21889: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203998.21892: Calling groups_plugins_play to load vars for managed-node3
7491 1727203998.23467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203998.24901: done with get_vars()
7491 1727203998.24919: done getting variables
7491 1727203998.24953: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Tuesday 24 September 2024 14:53:18 -0400 (0:00:00.097) 0:00:40.173 *****
7491 1727203998.24977: entering _queue_task() for managed-node3/set_fact
7491 1727203998.25209: worker is 1 (out of 1 available)
7491 1727203998.25221: exiting _queue_task() for managed-node3/set_fact
7491 1727203998.25237: done queuing things up, now waiting for results queue to drain
7491 1727203998.25238: waiting for pending results...
7491 1727203998.25428: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag
7491 1727203998.25511: in run() - task 0affcd87-79f5-0a4a-ad01-00000000195f
7491 1727203998.25526: variable 'ansible_search_path' from source: unknown
7491 1727203998.25530: variable 'ansible_search_path' from source: unknown
7491 1727203998.25557: calling self._execute()
7491 1727203998.25650: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203998.25654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203998.25662: variable 'omit' from source: magic vars
7491 1727203998.25948: variable 'ansible_distribution_major_version' from source: facts
7491 1727203998.25958: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203998.25966: variable 'omit' from source: magic vars
7491 1727203998.26001: variable 'omit' from source: magic vars
7491 1727203998.26029: variable 'omit' from source: magic vars
7491 1727203998.26066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7491 1727203998.26092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7491 1727203998.26109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7491 1727203998.26125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203998.26134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203998.26161: variable 'inventory_hostname' from source: host vars for 'managed-node3'
7491 1727203998.26166: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203998.26169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203998.26237: Set connection var ansible_timeout to 10
7491 1727203998.26242: Set connection var ansible_pipelining to False
7491 1727203998.26249: Set connection var ansible_shell_type to sh
7491 1727203998.26254: Set connection var ansible_module_compression to ZIP_DEFLATED
7491 1727203998.26263: Set connection var ansible_shell_executable to /bin/sh
7491 1727203998.26265: Set connection var ansible_connection to ssh
7491 1727203998.26286: variable 'ansible_shell_executable' from source: unknown
7491 1727203998.26289: variable 'ansible_connection' from source: unknown
7491 1727203998.26291: variable 'ansible_module_compression' from source: unknown
7491 1727203998.26294: variable 'ansible_shell_type' from source: unknown
7491 1727203998.26296: variable 'ansible_shell_executable' from source: unknown
7491 1727203998.26298: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203998.26302: variable 'ansible_pipelining' from source: unknown
7491 1727203998.26305: variable 'ansible_timeout' from source: unknown
7491 1727203998.26309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203998.26615: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7491 1727203998.26621: variable 'omit' from source: magic vars
7491 1727203998.26624: starting attempt loop
7491 1727203998.26626: running the handler
7491 1727203998.26628: handler run complete
7491 1727203998.26630: attempt loop complete, returning result
7491 1727203998.26631: _execute() done
7491 1727203998.26633: dumping result to json
7491 1727203998.26635: done dumping result, returning
7491 1727203998.26637: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-0a4a-ad01-00000000195f]
7491 1727203998.26638: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000195f
7491 1727203998.26709: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000195f
7491 1727203998.26712: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
7491 1727203998.26761: no more pending results, returning what we have
7491 1727203998.26765: results queue empty
7491 1727203998.26766: checking for any_errors_fatal
7491 1727203998.26768: done checking for any_errors_fatal
7491 1727203998.26769: checking for max_fail_percentage
7491 1727203998.26770: done checking for max_fail_percentage
7491 1727203998.26771: checking to see if all hosts have failed and the running result is not ok
7491 1727203998.26772: done checking to see if all hosts have failed
7491 1727203998.26773: getting the remaining hosts for this loop
7491 1727203998.26774: done getting the remaining hosts for this loop
7491 1727203998.26783: getting the next task for host managed-node3
7491 1727203998.26789: done getting next task for host managed-node3
7491 1727203998.26792: ^ task is: TASK: Stat profile file
7491 1727203998.26795: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727203998.26800: getting variables
7491 1727203998.26803: in VariableManager get_vars()
7491 1727203998.26869: Calling all_inventory to load vars for managed-node3
7491 1727203998.26873: Calling groups_inventory to load vars for managed-node3
7491 1727203998.26875: Calling all_plugins_inventory to load vars for managed-node3
7491 1727203998.26892: Calling all_plugins_play to load vars for managed-node3
7491 1727203998.26896: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727203998.26899: Calling groups_plugins_play to load vars for managed-node3
7491 1727203998.28154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727203998.29083: done with get_vars()
7491 1727203998.29101: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Tuesday 24 September 2024 14:53:18 -0400 (0:00:00.041) 0:00:40.215 *****
7491 1727203998.29172: entering _queue_task() for managed-node3/stat
7491 1727203998.29398: worker is 1 (out of 1 available)
7491 1727203998.29412: exiting _queue_task() for managed-node3/stat
7491 1727203998.29425: done queuing things up, now waiting for results queue to drain
7491 1727203998.29427: waiting for pending results...
7491 1727203998.29617: running TaskExecutor() for managed-node3/TASK: Stat profile file
7491 1727203998.29697: in run() - task 0affcd87-79f5-0a4a-ad01-000000001960
7491 1727203998.29710: variable 'ansible_search_path' from source: unknown
7491 1727203998.29715: variable 'ansible_search_path' from source: unknown
7491 1727203998.29750: calling self._execute()
7491 1727203998.29835: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203998.29840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203998.29850: variable 'omit' from source: magic vars
7491 1727203998.30139: variable 'ansible_distribution_major_version' from source: facts
7491 1727203998.30153: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727203998.30157: variable 'omit' from source: magic vars
7491 1727203998.30194: variable 'omit' from source: magic vars
7491 1727203998.30267: variable 'profile' from source: include params
7491 1727203998.30271: variable 'interface' from source: play vars
7491 1727203998.30321: variable 'interface' from source: play vars
7491 1727203998.30333: variable 'omit' from source: magic vars
7491 1727203998.30371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7491 1727203998.30400: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7491 1727203998.30421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7491 1727203998.30432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203998.30442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727203998.30467: variable 'inventory_hostname' from source: host vars for 'managed-node3'
7491 1727203998.30470: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203998.30473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203998.30544: Set connection var ansible_timeout to 10
7491 1727203998.30549: Set connection var ansible_pipelining to False
7491 1727203998.30554: Set connection var ansible_shell_type to sh
7491 1727203998.30560: Set connection var ansible_module_compression to ZIP_DEFLATED
7491 1727203998.30569: Set connection var ansible_shell_executable to /bin/sh
7491 1727203998.30574: Set connection var ansible_connection to ssh
7491 1727203998.30594: variable 'ansible_shell_executable' from source: unknown
7491 1727203998.30597: variable 'ansible_connection' from source: unknown
7491 1727203998.30600: variable 'ansible_module_compression' from source: unknown
7491 1727203998.30604: variable 'ansible_shell_type' from source: unknown
7491 1727203998.30606: variable 'ansible_shell_executable' from source: unknown
7491 1727203998.30609: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727203998.30611: variable 'ansible_pipelining' from source: unknown
7491 1727203998.30613: variable 'ansible_timeout' from source: unknown
7491 1727203998.30615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727203998.30768: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
7491 1727203998.30776: variable 'omit' from source: magic vars
7491 1727203998.30782: starting attempt loop
7491 1727203998.30784: running the handler
7491 1727203998.30795: _low_level_execute_command(): starting
7491 1727203998.30805: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7491 1727203998.31341: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203998.31356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203998.31382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<<
7491 1727203998.31396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203998.31409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203998.31463: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7491 1727203998.31477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
7491 1727203998.31531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7491 1727203998.33146: stdout chunk (state=3): >>>/root <<<
7491 1727203998.33248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7491 1727203998.33309: stderr chunk (state=3): >>><<<
7491 1727203998.33313: stdout chunk (state=3): >>><<<
7491 1727203998.33340: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7491 1727203998.33350: _low_level_execute_command(): starting
7491 1727203998.33356: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203998.3333821-9336-46724000891046 `" && echo ansible-tmp-1727203998.3333821-9336-46724000891046="` echo /root/.ansible/tmp/ansible-tmp-1727203998.3333821-9336-46724000891046 `" ) && sleep 0'
7491 1727203998.33828: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203998.33832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7491 1727203998.33843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203998.33876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203998.33898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203998.33946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7491 1727203998.33958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
7491 1727203998.34012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7491 1727203998.35839: stdout chunk (state=3): >>>ansible-tmp-1727203998.3333821-9336-46724000891046=/root/.ansible/tmp/ansible-tmp-1727203998.3333821-9336-46724000891046 <<<
7491 1727203998.35949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7491 1727203998.36009: stderr chunk (state=3): >>><<<
7491 1727203998.36012: stdout chunk (state=3): >>><<<
7491 1727203998.36031: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203998.3333821-9336-46724000891046=/root/.ansible/tmp/ansible-tmp-1727203998.3333821-9336-46724000891046 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7491 1727203998.36075: variable 'ansible_module_compression' from source: unknown
7491 1727203998.36125: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
7491 1727203998.36155: variable 'ansible_facts' from source: unknown
7491 1727203998.36221: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203998.3333821-9336-46724000891046/AnsiballZ_stat.py
7491 1727203998.36332: Sending initial data
7491 1727203998.36336: Sent initial data (150 bytes)
7491 1727203998.37043: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203998.37049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203998.37080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203998.37092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203998.37144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7491 1727203998.37156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7491 1727203998.37172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7491 1727203998.37214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7491 1727203998.38884: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
7491 1727203998.38921: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
7491 1727203998.38959: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmppgdq2rh_ /root/.ansible/tmp/ansible-tmp-1727203998.3333821-9336-46724000891046/AnsiballZ_stat.py <<<
7491 1727203998.38994: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
7491 1727203998.39781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7491 1727203998.39889: stderr chunk (state=3): >>><<<
7491 1727203998.39893: stdout chunk (state=3): >>><<<
7491 1727203998.39912: done transferring module to remote
7491 1727203998.39924: _low_level_execute_command(): starting
7491 1727203998.39927: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203998.3333821-9336-46724000891046/ /root/.ansible/tmp/ansible-tmp-1727203998.3333821-9336-46724000891046/AnsiballZ_stat.py && sleep 0'
7491 1727203998.40387: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
7491 1727203998.40394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203998.40423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203998.40436: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7491 1727203998.40445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7491 1727203998.40499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7491 1727203998.40502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7491 1727203998.40520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7491 1727203998.40566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7491 1727203998.42239: stderr
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203998.42295: stderr chunk (state=3): >>><<< 7491 1727203998.42298: stdout chunk (state=3): >>><<< 7491 1727203998.42314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203998.42320: _low_level_execute_command(): starting 7491 1727203998.42323: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203998.3333821-9336-46724000891046/AnsiballZ_stat.py && sleep 0' 7491 1727203998.42769: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203998.42775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203998.42807: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.42821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203998.42832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.42879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203998.42891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203998.42946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203998.55971: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7491 1727203998.56961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203998.57025: stderr chunk (state=3): >>><<< 7491 1727203998.57029: stdout chunk (state=3): >>><<< 7491 1727203998.57046: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
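The module result above, `{"changed": false, "stat": {"exists": false}}`, records that `/etc/sysconfig/network-scripts/ifcfg-veth0` is absent on the target. As a rough local sketch of what that check reports (a hypothetical helper for illustration, not the real `ansible.builtin.stat` implementation, which also handles checksums, MIME types, and SELinux attributes):

```python
import os


def stat_profile(path):
    """Hypothetical sketch of the stat check logged above: report whether
    the profile file exists, and a few attributes when it does."""
    result = {"changed": False, "stat": {"exists": os.path.exists(path)}}
    if result["stat"]["exists"]:
        st = os.stat(path)
        # The real module returns many more fields; size and mode suffice here.
        result["stat"].update({"size": st.st_size,
                               "mode": oct(st.st_mode & 0o7777)})
    return result
```

For a missing path this yields exactly the shape seen in the log's stdout chunk, which Ansible then registers (here as `profile_stat`) for later conditionals.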
7491 1727203998.57072: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203998.3333821-9336-46724000891046/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203998.57080: _low_level_execute_command(): starting 7491 1727203998.57085: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203998.3333821-9336-46724000891046/ > /dev/null 2>&1 && sleep 0' 7491 1727203998.57558: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203998.57573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203998.57604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203998.57621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.57667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203998.57680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203998.57701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203998.57735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203998.59511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203998.59569: stderr chunk (state=3): >>><<< 7491 1727203998.59573: stdout chunk (state=3): >>><<< 7491 1727203998.59588: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 7491 1727203998.59595: handler run complete 7491 1727203998.59614: attempt loop complete, returning result 7491 1727203998.59623: _execute() done 7491 1727203998.59630: dumping result to json 7491 1727203998.59634: done dumping result, returning 7491 1727203998.59643: done running TaskExecutor() for managed-node3/TASK: Stat profile file [0affcd87-79f5-0a4a-ad01-000000001960] 7491 1727203998.59647: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001960 7491 1727203998.59748: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001960 7491 1727203998.59752: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 7491 1727203998.59811: no more pending results, returning what we have 7491 1727203998.59814: results queue empty 7491 1727203998.59815: checking for any_errors_fatal 7491 1727203998.59823: done checking for any_errors_fatal 7491 1727203998.59824: checking for max_fail_percentage 7491 1727203998.59826: done checking for max_fail_percentage 7491 1727203998.59827: checking to see if all hosts have failed and the running result is not ok 7491 1727203998.59828: done checking to see if all hosts have failed 7491 1727203998.59828: getting the remaining hosts for this loop 7491 1727203998.59830: done getting the remaining hosts for this loop 7491 1727203998.59834: getting the next task for host managed-node3 7491 1727203998.59840: done getting next task for host managed-node3 7491 1727203998.59842: ^ task is: TASK: Set NM profile exist flag based on the profile files 7491 1727203998.59846: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203998.59851: getting variables 7491 1727203998.59852: in VariableManager get_vars() 7491 1727203998.59907: Calling all_inventory to load vars for managed-node3 7491 1727203998.59910: Calling groups_inventory to load vars for managed-node3 7491 1727203998.59912: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203998.59923: Calling all_plugins_play to load vars for managed-node3 7491 1727203998.59925: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203998.59928: Calling groups_plugins_play to load vars for managed-node3 7491 1727203998.60908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203998.61829: done with get_vars() 7491 1727203998.61846: done getting variables 7491 1727203998.61894: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:53:18 -0400 (0:00:00.327) 0:00:40.543 ***** 7491 1727203998.61920: entering _queue_task() for managed-node3/set_fact 7491 
1727203998.62153: worker is 1 (out of 1 available) 7491 1727203998.62168: exiting _queue_task() for managed-node3/set_fact 7491 1727203998.62182: done queuing things up, now waiting for results queue to drain 7491 1727203998.62184: waiting for pending results... 7491 1727203998.62373: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 7491 1727203998.62455: in run() - task 0affcd87-79f5-0a4a-ad01-000000001961 7491 1727203998.62468: variable 'ansible_search_path' from source: unknown 7491 1727203998.62472: variable 'ansible_search_path' from source: unknown 7491 1727203998.62503: calling self._execute() 7491 1727203998.62583: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203998.62587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203998.62601: variable 'omit' from source: magic vars 7491 1727203998.62879: variable 'ansible_distribution_major_version' from source: facts 7491 1727203998.62889: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203998.62978: variable 'profile_stat' from source: set_fact 7491 1727203998.62990: Evaluated conditional (profile_stat.stat.exists): False 7491 1727203998.62994: when evaluation is False, skipping this task 7491 1727203998.62997: _execute() done 7491 1727203998.63000: dumping result to json 7491 1727203998.63002: done dumping result, returning 7491 1727203998.63007: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-0a4a-ad01-000000001961] 7491 1727203998.63013: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001961 7491 1727203998.63099: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001961 7491 1727203998.63102: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7491 
1727203998.63156: no more pending results, returning what we have 7491 1727203998.63159: results queue empty 7491 1727203998.63160: checking for any_errors_fatal 7491 1727203998.63172: done checking for any_errors_fatal 7491 1727203998.63172: checking for max_fail_percentage 7491 1727203998.63174: done checking for max_fail_percentage 7491 1727203998.63175: checking to see if all hosts have failed and the running result is not ok 7491 1727203998.63177: done checking to see if all hosts have failed 7491 1727203998.63178: getting the remaining hosts for this loop 7491 1727203998.63179: done getting the remaining hosts for this loop 7491 1727203998.63183: getting the next task for host managed-node3 7491 1727203998.63188: done getting next task for host managed-node3 7491 1727203998.63191: ^ task is: TASK: Get NM profile info 7491 1727203998.63194: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203998.63198: getting variables 7491 1727203998.63199: in VariableManager get_vars() 7491 1727203998.63245: Calling all_inventory to load vars for managed-node3 7491 1727203998.63254: Calling groups_inventory to load vars for managed-node3 7491 1727203998.63257: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203998.63268: Calling all_plugins_play to load vars for managed-node3 7491 1727203998.63271: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203998.63273: Calling groups_plugins_play to load vars for managed-node3 7491 1727203998.64072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203998.65000: done with get_vars() 7491 1727203998.65016: done getting variables 7491 1727203998.65061: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:53:18 -0400 (0:00:00.031) 0:00:40.574 ***** 7491 1727203998.65086: entering _queue_task() for managed-node3/shell 7491 1727203998.65306: worker is 1 (out of 1 available) 7491 1727203998.65322: exiting _queue_task() for managed-node3/shell 7491 1727203998.65336: done queuing things up, now waiting for results queue to drain 7491 1727203998.65337: waiting for pending results... 
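The skip recorded above (`"false_condition": "profile_stat.stat.exists"`) follows from Ansible's `when:` handling: each conditional on the task is evaluated against gathered facts and registered variables, and the first falsy one short-circuits execution and is reported back. A minimal sketch of that evaluation, with hypothetical names (the real engine templates each expression through Jinja2 rather than doing a dotted lookup):

```python
def should_run(task_when, variables):
    """Hypothetical sketch of 'when:' evaluation: every conditional in the
    list must be truthy; return the first false condition, if any."""
    for cond in task_when:
        # Resolve a dotted path like 'profile_stat.stat.exists' against the
        # variable namespace (stand-in for Jinja2 templating).
        value = variables
        for part in cond.split("."):
            value = value[part]
        if not value:
            return False, cond
    return True, None


variables = {"profile_stat": {"stat": {"exists": False}}}
ok, false_condition = should_run(["profile_stat.stat.exists"], variables)
# ok is False; false_condition matches the "false_condition" field
# in the skip result logged above.
```

Had the earlier stat task found the profile file, `profile_stat.stat.exists` would be truthy and the flag-setting task would have run instead of being skipped.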
7491 1727203998.65520: running TaskExecutor() for managed-node3/TASK: Get NM profile info 7491 1727203998.65602: in run() - task 0affcd87-79f5-0a4a-ad01-000000001962 7491 1727203998.65614: variable 'ansible_search_path' from source: unknown 7491 1727203998.65620: variable 'ansible_search_path' from source: unknown 7491 1727203998.65648: calling self._execute() 7491 1727203998.65727: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203998.65731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203998.65740: variable 'omit' from source: magic vars 7491 1727203998.66021: variable 'ansible_distribution_major_version' from source: facts 7491 1727203998.66028: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203998.66035: variable 'omit' from source: magic vars 7491 1727203998.66067: variable 'omit' from source: magic vars 7491 1727203998.66141: variable 'profile' from source: include params 7491 1727203998.66145: variable 'interface' from source: play vars 7491 1727203998.66192: variable 'interface' from source: play vars 7491 1727203998.66209: variable 'omit' from source: magic vars 7491 1727203998.66244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203998.66272: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203998.66292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203998.66308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203998.66316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203998.66342: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203998.66345: variable 
'ansible_host' from source: host vars for 'managed-node3' 7491 1727203998.66348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203998.66421: Set connection var ansible_timeout to 10 7491 1727203998.66424: Set connection var ansible_pipelining to False 7491 1727203998.66431: Set connection var ansible_shell_type to sh 7491 1727203998.66438: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203998.66443: Set connection var ansible_shell_executable to /bin/sh 7491 1727203998.66448: Set connection var ansible_connection to ssh 7491 1727203998.66467: variable 'ansible_shell_executable' from source: unknown 7491 1727203998.66470: variable 'ansible_connection' from source: unknown 7491 1727203998.66473: variable 'ansible_module_compression' from source: unknown 7491 1727203998.66475: variable 'ansible_shell_type' from source: unknown 7491 1727203998.66478: variable 'ansible_shell_executable' from source: unknown 7491 1727203998.66480: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203998.66484: variable 'ansible_pipelining' from source: unknown 7491 1727203998.66486: variable 'ansible_timeout' from source: unknown 7491 1727203998.66490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203998.66592: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203998.66601: variable 'omit' from source: magic vars 7491 1727203998.66605: starting attempt loop 7491 1727203998.66608: running the handler 7491 1727203998.66617: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203998.66637: _low_level_execute_command(): starting 7491 1727203998.66642: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203998.67178: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203998.67194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203998.67222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.67236: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.67282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203998.67300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203998.67355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203998.68929: stdout chunk (state=3): >>>/root <<< 7491 1727203998.69029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203998.69093: stderr chunk 
(state=3): >>><<< 7491 1727203998.69096: stdout chunk (state=3): >>><<< 7491 1727203998.69129: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203998.69141: _low_level_execute_command(): starting 7491 1727203998.69148: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203998.6912892-9345-103803155880055 `" && echo ansible-tmp-1727203998.6912892-9345-103803155880055="` echo /root/.ansible/tmp/ansible-tmp-1727203998.6912892-9345-103803155880055 `" ) && sleep 0' 7491 1727203998.69617: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203998.69629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203998.69660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.69675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203998.69686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.69739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203998.69743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203998.69805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203998.71621: stdout chunk (state=3): >>>ansible-tmp-1727203998.6912892-9345-103803155880055=/root/.ansible/tmp/ansible-tmp-1727203998.6912892-9345-103803155880055 <<< 7491 1727203998.71730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203998.71788: stderr chunk (state=3): >>><<< 7491 1727203998.71791: stdout chunk (state=3): >>><<< 7491 1727203998.71813: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203998.6912892-9345-103803155880055=/root/.ansible/tmp/ansible-tmp-1727203998.6912892-9345-103803155880055 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203998.71842: variable 'ansible_module_compression' from source: unknown 7491 1727203998.71889: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203998.71921: variable 'ansible_facts' from source: unknown 7491 1727203998.71985: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203998.6912892-9345-103803155880055/AnsiballZ_command.py 7491 1727203998.72102: Sending initial data 7491 1727203998.72105: Sent initial data (154 bytes) 7491 1727203998.72803: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203998.72809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203998.72843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.72855: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.72908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203998.72920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203998.72970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203998.74628: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203998.74668: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203998.74711: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpw3cwcunn 
/root/.ansible/tmp/ansible-tmp-1727203998.6912892-9345-103803155880055/AnsiballZ_command.py <<< 7491 1727203998.74744: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203998.75538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203998.75651: stderr chunk (state=3): >>><<< 7491 1727203998.75654: stdout chunk (state=3): >>><<< 7491 1727203998.75673: done transferring module to remote 7491 1727203998.75683: _low_level_execute_command(): starting 7491 1727203998.75688: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203998.6912892-9345-103803155880055/ /root/.ansible/tmp/ansible-tmp-1727203998.6912892-9345-103803155880055/AnsiballZ_command.py && sleep 0' 7491 1727203998.76149: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203998.76158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203998.76190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.76202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.76254: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 7491 1727203998.76275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203998.76316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203998.77962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203998.78012: stderr chunk (state=3): >>><<< 7491 1727203998.78016: stdout chunk (state=3): >>><<< 7491 1727203998.78033: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203998.78036: _low_level_execute_command(): starting 7491 1727203998.78040: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203998.6912892-9345-103803155880055/AnsiballZ_command.py && sleep 0' 7491 1727203998.78491: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203998.78503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203998.78597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.78644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203998.78654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203998.78725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203998.93700: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-24 14:53:18.917358", "end": "2024-09-24 14:53:18.936106", "delta": "0:00:00.018748", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": 
null, "removes": null, "stdin": null}}} <<< 7491 1727203998.94874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727203998.94936: stderr chunk (state=3): >>><<< 7491 1727203998.94940: stdout chunk (state=3): >>><<< 7491 1727203998.94959: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-24 14:53:18.917358", "end": "2024-09-24 14:53:18.936106", "delta": "0:00:00.018748", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727203998.94993: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203998.6912892-9345-103803155880055/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203998.95002: _low_level_execute_command(): starting 7491 1727203998.95007: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203998.6912892-9345-103803155880055/ > /dev/null 2>&1 && sleep 0' 7491 1727203998.95486: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203998.95490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203998.95529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.95533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203998.95540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203998.95584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203998.95595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203998.95651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203998.97361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203998.97421: stderr chunk (state=3): >>><<< 7491 1727203998.97424: stdout chunk (state=3): >>><<< 7491 1727203998.97440: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203998.97446: handler run complete 7491 1727203998.97467: Evaluated conditional (False): False 7491 1727203998.97479: attempt loop complete, returning result 7491 1727203998.97482: _execute() done 7491 1727203998.97485: dumping result to json 7491 1727203998.97487: done dumping result, returning 7491 1727203998.97494: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [0affcd87-79f5-0a4a-ad01-000000001962] 7491 1727203998.97499: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001962 7491 1727203998.97600: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001962 7491 1727203998.97603: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.018748", "end": "2024-09-24 14:53:18.936106", "rc": 0, "start": "2024-09-24 14:53:18.917358" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 7491 1727203998.97679: no more pending results, returning what we have 7491 1727203998.97683: results queue empty 7491 1727203998.97684: checking for any_errors_fatal 7491 1727203998.97691: done checking for any_errors_fatal 7491 1727203998.97692: checking for max_fail_percentage 7491 1727203998.97693: done checking for max_fail_percentage 7491 1727203998.97694: checking to see if all hosts have failed and the running result is not ok 7491 1727203998.97696: done checking to see if all hosts have failed 7491 1727203998.97696: getting the remaining hosts for this loop 7491 1727203998.97698: done getting the remaining hosts for this loop 7491 1727203998.97701: getting the next task for host managed-node3 7491 1727203998.97707: done getting next task 
for host managed-node3 7491 1727203998.97710: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7491 1727203998.97715: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203998.97720: getting variables 7491 1727203998.97721: in VariableManager get_vars() 7491 1727203998.97774: Calling all_inventory to load vars for managed-node3 7491 1727203998.97778: Calling groups_inventory to load vars for managed-node3 7491 1727203998.97780: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203998.97791: Calling all_plugins_play to load vars for managed-node3 7491 1727203998.97793: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203998.97796: Calling groups_plugins_play to load vars for managed-node3 7491 1727203998.98747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203998.99667: done with get_vars() 7491 1727203998.99685: done getting variables 7491 1727203998.99732: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:53:18 -0400 (0:00:00.346) 0:00:40.921 ***** 7491 1727203998.99756: entering _queue_task() for managed-node3/set_fact 7491 1727203998.99992: worker is 1 (out of 1 available) 7491 1727203999.00007: exiting _queue_task() for managed-node3/set_fact 7491 1727203999.00020: done queuing things up, now waiting for results queue to drain 7491 1727203999.00021: waiting for pending results... 
7491 1727203999.00212: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7491 1727203999.00295: in run() - task 0affcd87-79f5-0a4a-ad01-000000001963 7491 1727203999.00309: variable 'ansible_search_path' from source: unknown 7491 1727203999.00313: variable 'ansible_search_path' from source: unknown 7491 1727203999.00344: calling self._execute() 7491 1727203999.00429: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.00432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.00441: variable 'omit' from source: magic vars 7491 1727203999.00748: variable 'ansible_distribution_major_version' from source: facts 7491 1727203999.00758: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203999.00856: variable 'nm_profile_exists' from source: set_fact 7491 1727203999.00871: Evaluated conditional (nm_profile_exists.rc == 0): True 7491 1727203999.00877: variable 'omit' from source: magic vars 7491 1727203999.00913: variable 'omit' from source: magic vars 7491 1727203999.00939: variable 'omit' from source: magic vars 7491 1727203999.00977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203999.01004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203999.01026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203999.01039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.01048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.01075: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203999.01079: 
variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.01082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.01158: Set connection var ansible_timeout to 10 7491 1727203999.01163: Set connection var ansible_pipelining to False 7491 1727203999.01169: Set connection var ansible_shell_type to sh 7491 1727203999.01175: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203999.01181: Set connection var ansible_shell_executable to /bin/sh 7491 1727203999.01186: Set connection var ansible_connection to ssh 7491 1727203999.01204: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.01207: variable 'ansible_connection' from source: unknown 7491 1727203999.01209: variable 'ansible_module_compression' from source: unknown 7491 1727203999.01211: variable 'ansible_shell_type' from source: unknown 7491 1727203999.01214: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.01216: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.01222: variable 'ansible_pipelining' from source: unknown 7491 1727203999.01226: variable 'ansible_timeout' from source: unknown 7491 1727203999.01228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.01333: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203999.01345: variable 'omit' from source: magic vars 7491 1727203999.01348: starting attempt loop 7491 1727203999.01351: running the handler 7491 1727203999.01366: handler run complete 7491 1727203999.01377: attempt loop complete, returning result 7491 1727203999.01379: _execute() done 7491 1727203999.01382: dumping result to json 
7491 1727203999.01384: done dumping result, returning 7491 1727203999.01391: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-0a4a-ad01-000000001963] 7491 1727203999.01397: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001963 7491 1727203999.01489: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001963 7491 1727203999.01492: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 7491 1727203999.01546: no more pending results, returning what we have 7491 1727203999.01549: results queue empty 7491 1727203999.01550: checking for any_errors_fatal 7491 1727203999.01568: done checking for any_errors_fatal 7491 1727203999.01569: checking for max_fail_percentage 7491 1727203999.01571: done checking for max_fail_percentage 7491 1727203999.01572: checking to see if all hosts have failed and the running result is not ok 7491 1727203999.01573: done checking to see if all hosts have failed 7491 1727203999.01574: getting the remaining hosts for this loop 7491 1727203999.01576: done getting the remaining hosts for this loop 7491 1727203999.01579: getting the next task for host managed-node3 7491 1727203999.01588: done getting next task for host managed-node3 7491 1727203999.01590: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 7491 1727203999.01594: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203999.01599: getting variables 7491 1727203999.01600: in VariableManager get_vars() 7491 1727203999.01647: Calling all_inventory to load vars for managed-node3 7491 1727203999.01650: Calling groups_inventory to load vars for managed-node3 7491 1727203999.01652: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203999.01663: Calling all_plugins_play to load vars for managed-node3 7491 1727203999.01670: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203999.01673: Calling groups_plugins_play to load vars for managed-node3 7491 1727203999.02507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203999.03559: done with get_vars() 7491 1727203999.03579: done getting variables 7491 1727203999.03627: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203999.03721: variable 'profile' from source: include params 7491 1727203999.03725: variable 'interface' from source: play vars 7491 1727203999.03772: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.040) 0:00:40.961 ***** 7491 1727203999.03803: entering _queue_task() for managed-node3/command 7491 1727203999.04039: worker is 1 (out of 1 available) 7491 1727203999.04053: exiting _queue_task() for managed-node3/command 7491 1727203999.04068: done queuing things up, now waiting for results queue to drain 7491 1727203999.04069: waiting for pending results... 7491 1727203999.04268: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-veth0 7491 1727203999.04347: in run() - task 0affcd87-79f5-0a4a-ad01-000000001965 7491 1727203999.04357: variable 'ansible_search_path' from source: unknown 7491 1727203999.04361: variable 'ansible_search_path' from source: unknown 7491 1727203999.04394: calling self._execute() 7491 1727203999.04474: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.04478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.04490: variable 'omit' from source: magic vars 7491 1727203999.04769: variable 'ansible_distribution_major_version' from source: facts 7491 1727203999.04779: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203999.04869: variable 'profile_stat' from source: set_fact 7491 1727203999.04882: Evaluated conditional (profile_stat.stat.exists): False 7491 1727203999.04886: when evaluation is False, skipping this task 7491 1727203999.04889: _execute() done 7491 1727203999.04892: dumping result to json 7491 1727203999.04894: done dumping result, returning 7491 1727203999.04900: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [0affcd87-79f5-0a4a-ad01-000000001965] 7491 1727203999.04906: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001965 7491 1727203999.04994: done sending 
task result for task 0affcd87-79f5-0a4a-ad01-000000001965 7491 1727203999.04996: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7491 1727203999.05052: no more pending results, returning what we have 7491 1727203999.05055: results queue empty 7491 1727203999.05056: checking for any_errors_fatal 7491 1727203999.05066: done checking for any_errors_fatal 7491 1727203999.05066: checking for max_fail_percentage 7491 1727203999.05068: done checking for max_fail_percentage 7491 1727203999.05069: checking to see if all hosts have failed and the running result is not ok 7491 1727203999.05070: done checking to see if all hosts have failed 7491 1727203999.05071: getting the remaining hosts for this loop 7491 1727203999.05073: done getting the remaining hosts for this loop 7491 1727203999.05076: getting the next task for host managed-node3 7491 1727203999.05083: done getting next task for host managed-node3 7491 1727203999.05086: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 7491 1727203999.05090: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7491 1727203999.05094: getting variables 7491 1727203999.05095: in VariableManager get_vars() 7491 1727203999.05152: Calling all_inventory to load vars for managed-node3 7491 1727203999.05155: Calling groups_inventory to load vars for managed-node3 7491 1727203999.05157: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203999.05168: Calling all_plugins_play to load vars for managed-node3 7491 1727203999.05171: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203999.05173: Calling groups_plugins_play to load vars for managed-node3 7491 1727203999.05989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203999.06915: done with get_vars() 7491 1727203999.06935: done getting variables 7491 1727203999.06982: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203999.07063: variable 'profile' from source: include params 7491 1727203999.07067: variable 'interface' from source: play vars 7491 1727203999.07109: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.033) 0:00:40.995 ***** 7491 1727203999.07133: entering _queue_task() for managed-node3/set_fact 7491 1727203999.07362: worker is 1 (out of 1 available) 7491 1727203999.07378: exiting _queue_task() for managed-node3/set_fact 7491 1727203999.07390: done queuing things up, now waiting for results queue to drain 7491 1727203999.07392: waiting for pending results... 
7491 1727203999.07577: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 7491 1727203999.07661: in run() - task 0affcd87-79f5-0a4a-ad01-000000001966 7491 1727203999.07676: variable 'ansible_search_path' from source: unknown 7491 1727203999.07680: variable 'ansible_search_path' from source: unknown 7491 1727203999.07708: calling self._execute() 7491 1727203999.07790: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.07794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.07802: variable 'omit' from source: magic vars 7491 1727203999.08078: variable 'ansible_distribution_major_version' from source: facts 7491 1727203999.08088: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203999.08177: variable 'profile_stat' from source: set_fact 7491 1727203999.08189: Evaluated conditional (profile_stat.stat.exists): False 7491 1727203999.08192: when evaluation is False, skipping this task 7491 1727203999.08196: _execute() done 7491 1727203999.08198: dumping result to json 7491 1727203999.08200: done dumping result, returning 7491 1727203999.08206: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0affcd87-79f5-0a4a-ad01-000000001966] 7491 1727203999.08212: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001966 7491 1727203999.08304: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001966 7491 1727203999.08307: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7491 1727203999.08353: no more pending results, returning what we have 7491 1727203999.08357: results queue empty 7491 1727203999.08358: checking for any_errors_fatal 7491 1727203999.08367: done checking for any_errors_fatal 7491 1727203999.08368: checking for 
max_fail_percentage 7491 1727203999.08370: done checking for max_fail_percentage 7491 1727203999.08371: checking to see if all hosts have failed and the running result is not ok 7491 1727203999.08372: done checking to see if all hosts have failed 7491 1727203999.08373: getting the remaining hosts for this loop 7491 1727203999.08379: done getting the remaining hosts for this loop 7491 1727203999.08383: getting the next task for host managed-node3 7491 1727203999.08388: done getting next task for host managed-node3 7491 1727203999.08392: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 7491 1727203999.08395: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203999.08400: getting variables 7491 1727203999.08401: in VariableManager get_vars() 7491 1727203999.08449: Calling all_inventory to load vars for managed-node3 7491 1727203999.08451: Calling groups_inventory to load vars for managed-node3 7491 1727203999.08453: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203999.08463: Calling all_plugins_play to load vars for managed-node3 7491 1727203999.08466: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203999.08469: Calling groups_plugins_play to load vars for managed-node3 7491 1727203999.09431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203999.10336: done with get_vars() 7491 1727203999.10352: done getting variables 7491 1727203999.10399: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203999.10482: variable 'profile' from source: include params 7491 1727203999.10485: variable 'interface' from source: play vars 7491 1727203999.10529: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.034) 0:00:41.029 ***** 7491 1727203999.10553: entering _queue_task() for managed-node3/command 7491 1727203999.10780: worker is 1 (out of 1 available) 7491 1727203999.10797: exiting _queue_task() for managed-node3/command 7491 1727203999.10811: done queuing things up, now waiting for results queue to drain 7491 1727203999.10812: waiting for pending results... 
7491 1727203999.10993: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-veth0 7491 1727203999.11076: in run() - task 0affcd87-79f5-0a4a-ad01-000000001967 7491 1727203999.11087: variable 'ansible_search_path' from source: unknown 7491 1727203999.11091: variable 'ansible_search_path' from source: unknown 7491 1727203999.11123: calling self._execute() 7491 1727203999.11211: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.11214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.11229: variable 'omit' from source: magic vars 7491 1727203999.11510: variable 'ansible_distribution_major_version' from source: facts 7491 1727203999.11521: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203999.11611: variable 'profile_stat' from source: set_fact 7491 1727203999.11627: Evaluated conditional (profile_stat.stat.exists): False 7491 1727203999.11631: when evaluation is False, skipping this task 7491 1727203999.11634: _execute() done 7491 1727203999.11636: dumping result to json 7491 1727203999.11639: done dumping result, returning 7491 1727203999.11645: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-veth0 [0affcd87-79f5-0a4a-ad01-000000001967] 7491 1727203999.11651: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001967 7491 1727203999.11740: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001967 7491 1727203999.11744: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7491 1727203999.11796: no more pending results, returning what we have 7491 1727203999.11800: results queue empty 7491 1727203999.11801: checking for any_errors_fatal 7491 1727203999.11811: done checking for any_errors_fatal 7491 1727203999.11812: checking for max_fail_percentage 7491 
1727203999.11813: done checking for max_fail_percentage 7491 1727203999.11814: checking to see if all hosts have failed and the running result is not ok 7491 1727203999.11816: done checking to see if all hosts have failed 7491 1727203999.11817: getting the remaining hosts for this loop 7491 1727203999.11821: done getting the remaining hosts for this loop 7491 1727203999.11825: getting the next task for host managed-node3 7491 1727203999.11830: done getting next task for host managed-node3 7491 1727203999.11833: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 7491 1727203999.11836: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203999.11840: getting variables 7491 1727203999.11841: in VariableManager get_vars() 7491 1727203999.11890: Calling all_inventory to load vars for managed-node3 7491 1727203999.11893: Calling groups_inventory to load vars for managed-node3 7491 1727203999.11895: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203999.11905: Calling all_plugins_play to load vars for managed-node3 7491 1727203999.11907: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203999.11910: Calling groups_plugins_play to load vars for managed-node3 7491 1727203999.12723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203999.13659: done with get_vars() 7491 1727203999.13679: done getting variables 7491 1727203999.13728: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203999.13808: variable 'profile' from source: include params 7491 1727203999.13811: variable 'interface' from source: play vars 7491 1727203999.13855: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.033) 0:00:41.062 ***** 7491 1727203999.13880: entering _queue_task() for managed-node3/set_fact 7491 1727203999.14106: worker is 1 (out of 1 available) 7491 1727203999.14122: exiting _queue_task() for managed-node3/set_fact 7491 1727203999.14135: done queuing things up, now waiting for results queue to drain 7491 1727203999.14137: waiting for pending results... 
7491 1727203999.14311: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-veth0 7491 1727203999.14392: in run() - task 0affcd87-79f5-0a4a-ad01-000000001968 7491 1727203999.14404: variable 'ansible_search_path' from source: unknown 7491 1727203999.14408: variable 'ansible_search_path' from source: unknown 7491 1727203999.14436: calling self._execute() 7491 1727203999.14515: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.14521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.14528: variable 'omit' from source: magic vars 7491 1727203999.14800: variable 'ansible_distribution_major_version' from source: facts 7491 1727203999.14809: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203999.14895: variable 'profile_stat' from source: set_fact 7491 1727203999.14908: Evaluated conditional (profile_stat.stat.exists): False 7491 1727203999.14913: when evaluation is False, skipping this task 7491 1727203999.14916: _execute() done 7491 1727203999.14922: dumping result to json 7491 1727203999.14926: done dumping result, returning 7491 1727203999.14929: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [0affcd87-79f5-0a4a-ad01-000000001968] 7491 1727203999.14931: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001968 7491 1727203999.15021: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001968 7491 1727203999.15024: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7491 1727203999.15089: no more pending results, returning what we have 7491 1727203999.15092: results queue empty 7491 1727203999.15093: checking for any_errors_fatal 7491 1727203999.15099: done checking for any_errors_fatal 7491 1727203999.15099: checking for max_fail_percentage 7491 
1727203999.15101: done checking for max_fail_percentage 7491 1727203999.15102: checking to see if all hosts have failed and the running result is not ok 7491 1727203999.15103: done checking to see if all hosts have failed 7491 1727203999.15103: getting the remaining hosts for this loop 7491 1727203999.15105: done getting the remaining hosts for this loop 7491 1727203999.15108: getting the next task for host managed-node3 7491 1727203999.15116: done getting next task for host managed-node3 7491 1727203999.15120: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 7491 1727203999.15123: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203999.15127: getting variables 7491 1727203999.15128: in VariableManager get_vars() 7491 1727203999.15180: Calling all_inventory to load vars for managed-node3 7491 1727203999.15183: Calling groups_inventory to load vars for managed-node3 7491 1727203999.15185: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203999.15194: Calling all_plugins_play to load vars for managed-node3 7491 1727203999.15196: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203999.15198: Calling groups_plugins_play to load vars for managed-node3 7491 1727203999.16130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203999.17046: done with get_vars() 7491 1727203999.17065: done getting variables 7491 1727203999.17110: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203999.17199: variable 'profile' from source: include params 7491 1727203999.17203: variable 'interface' from source: play vars 7491 1727203999.17247: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.033) 0:00:41.096 ***** 7491 1727203999.17273: entering _queue_task() for managed-node3/assert 7491 1727203999.17509: worker is 1 (out of 1 available) 7491 1727203999.17526: exiting _queue_task() for managed-node3/assert 7491 1727203999.17539: done queuing things up, now waiting for results queue to drain 7491 1727203999.17541: waiting for pending results... 
7491 1727203999.17726: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'veth0' 7491 1727203999.17795: in run() - task 0affcd87-79f5-0a4a-ad01-0000000016d3 7491 1727203999.17806: variable 'ansible_search_path' from source: unknown 7491 1727203999.17809: variable 'ansible_search_path' from source: unknown 7491 1727203999.17840: calling self._execute() 7491 1727203999.17924: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.17928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.17935: variable 'omit' from source: magic vars 7491 1727203999.18218: variable 'ansible_distribution_major_version' from source: facts 7491 1727203999.18231: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203999.18236: variable 'omit' from source: magic vars 7491 1727203999.18269: variable 'omit' from source: magic vars 7491 1727203999.18346: variable 'profile' from source: include params 7491 1727203999.18350: variable 'interface' from source: play vars 7491 1727203999.18397: variable 'interface' from source: play vars 7491 1727203999.18412: variable 'omit' from source: magic vars 7491 1727203999.18452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203999.18479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203999.18496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203999.18511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.18525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.18550: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 
1727203999.18553: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.18556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.18627: Set connection var ansible_timeout to 10 7491 1727203999.18637: Set connection var ansible_pipelining to False 7491 1727203999.18643: Set connection var ansible_shell_type to sh 7491 1727203999.18649: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203999.18656: Set connection var ansible_shell_executable to /bin/sh 7491 1727203999.18660: Set connection var ansible_connection to ssh 7491 1727203999.18681: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.18684: variable 'ansible_connection' from source: unknown 7491 1727203999.18686: variable 'ansible_module_compression' from source: unknown 7491 1727203999.18689: variable 'ansible_shell_type' from source: unknown 7491 1727203999.18691: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.18693: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.18696: variable 'ansible_pipelining' from source: unknown 7491 1727203999.18698: variable 'ansible_timeout' from source: unknown 7491 1727203999.18703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.18809: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203999.18821: variable 'omit' from source: magic vars 7491 1727203999.18824: starting attempt loop 7491 1727203999.18827: running the handler 7491 1727203999.18910: variable 'lsr_net_profile_exists' from source: set_fact 7491 1727203999.18913: Evaluated conditional (lsr_net_profile_exists): True 7491 1727203999.18922: 
handler run complete 7491 1727203999.18932: attempt loop complete, returning result 7491 1727203999.18935: _execute() done 7491 1727203999.18937: dumping result to json 7491 1727203999.18940: done dumping result, returning 7491 1727203999.18947: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'veth0' [0affcd87-79f5-0a4a-ad01-0000000016d3] 7491 1727203999.18951: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000016d3 7491 1727203999.19041: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000016d3 7491 1727203999.19045: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727203999.19115: no more pending results, returning what we have 7491 1727203999.19121: results queue empty 7491 1727203999.19122: checking for any_errors_fatal 7491 1727203999.19132: done checking for any_errors_fatal 7491 1727203999.19132: checking for max_fail_percentage 7491 1727203999.19134: done checking for max_fail_percentage 7491 1727203999.19135: checking to see if all hosts have failed and the running result is not ok 7491 1727203999.19136: done checking to see if all hosts have failed 7491 1727203999.19137: getting the remaining hosts for this loop 7491 1727203999.19139: done getting the remaining hosts for this loop 7491 1727203999.19142: getting the next task for host managed-node3 7491 1727203999.19148: done getting next task for host managed-node3 7491 1727203999.19150: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 7491 1727203999.19152: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203999.19155: getting variables 7491 1727203999.19157: in VariableManager get_vars() 7491 1727203999.19210: Calling all_inventory to load vars for managed-node3 7491 1727203999.19212: Calling groups_inventory to load vars for managed-node3 7491 1727203999.19214: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203999.19226: Calling all_plugins_play to load vars for managed-node3 7491 1727203999.19229: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203999.19231: Calling groups_plugins_play to load vars for managed-node3 7491 1727203999.20077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203999.21134: done with get_vars() 7491 1727203999.21152: done getting variables 7491 1727203999.21200: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203999.21292: variable 'profile' from source: include params 7491 1727203999.21296: variable 'interface' from source: play vars 7491 1727203999.21343: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.040) 0:00:41.137 ***** 7491 1727203999.21375: entering _queue_task() for managed-node3/assert 7491 1727203999.21616: worker is 1 (out of 1 available) 7491 1727203999.21632: exiting _queue_task() for 
managed-node3/assert 7491 1727203999.21646: done queuing things up, now waiting for results queue to drain 7491 1727203999.21647: waiting for pending results... 7491 1727203999.21831: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'veth0' 7491 1727203999.21905: in run() - task 0affcd87-79f5-0a4a-ad01-0000000016d4 7491 1727203999.21916: variable 'ansible_search_path' from source: unknown 7491 1727203999.21923: variable 'ansible_search_path' from source: unknown 7491 1727203999.21948: calling self._execute() 7491 1727203999.22031: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.22035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.22044: variable 'omit' from source: magic vars 7491 1727203999.22319: variable 'ansible_distribution_major_version' from source: facts 7491 1727203999.22334: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203999.22341: variable 'omit' from source: magic vars 7491 1727203999.22373: variable 'omit' from source: magic vars 7491 1727203999.22454: variable 'profile' from source: include params 7491 1727203999.22458: variable 'interface' from source: play vars 7491 1727203999.22506: variable 'interface' from source: play vars 7491 1727203999.22525: variable 'omit' from source: magic vars 7491 1727203999.22563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203999.22590: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203999.22607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203999.22619: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.22630: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.22661: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203999.22667: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.22669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.22736: Set connection var ansible_timeout to 10 7491 1727203999.22742: Set connection var ansible_pipelining to False 7491 1727203999.22754: Set connection var ansible_shell_type to sh 7491 1727203999.22759: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203999.22767: Set connection var ansible_shell_executable to /bin/sh 7491 1727203999.22772: Set connection var ansible_connection to ssh 7491 1727203999.22790: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.22793: variable 'ansible_connection' from source: unknown 7491 1727203999.22795: variable 'ansible_module_compression' from source: unknown 7491 1727203999.22798: variable 'ansible_shell_type' from source: unknown 7491 1727203999.22801: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.22803: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.22807: variable 'ansible_pipelining' from source: unknown 7491 1727203999.22809: variable 'ansible_timeout' from source: unknown 7491 1727203999.22814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.22916: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203999.22926: variable 'omit' from source: magic vars 7491 1727203999.22931: starting attempt loop 7491 
1727203999.22934: running the handler 7491 1727203999.23011: variable 'lsr_net_profile_ansible_managed' from source: set_fact 7491 1727203999.23015: Evaluated conditional (lsr_net_profile_ansible_managed): True 7491 1727203999.23022: handler run complete 7491 1727203999.23033: attempt loop complete, returning result 7491 1727203999.23036: _execute() done 7491 1727203999.23038: dumping result to json 7491 1727203999.23041: done dumping result, returning 7491 1727203999.23048: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'veth0' [0affcd87-79f5-0a4a-ad01-0000000016d4] 7491 1727203999.23053: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000016d4 7491 1727203999.23143: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000016d4 7491 1727203999.23146: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727203999.23231: no more pending results, returning what we have 7491 1727203999.23235: results queue empty 7491 1727203999.23236: checking for any_errors_fatal 7491 1727203999.23242: done checking for any_errors_fatal 7491 1727203999.23242: checking for max_fail_percentage 7491 1727203999.23244: done checking for max_fail_percentage 7491 1727203999.23245: checking to see if all hosts have failed and the running result is not ok 7491 1727203999.23246: done checking to see if all hosts have failed 7491 1727203999.23246: getting the remaining hosts for this loop 7491 1727203999.23248: done getting the remaining hosts for this loop 7491 1727203999.23252: getting the next task for host managed-node3 7491 1727203999.23256: done getting next task for host managed-node3 7491 1727203999.23259: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 7491 1727203999.23261: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203999.23266: getting variables 7491 1727203999.23267: in VariableManager get_vars() 7491 1727203999.23317: Calling all_inventory to load vars for managed-node3 7491 1727203999.23322: Calling groups_inventory to load vars for managed-node3 7491 1727203999.23325: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203999.23334: Calling all_plugins_play to load vars for managed-node3 7491 1727203999.23337: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203999.23339: Calling groups_plugins_play to load vars for managed-node3 7491 1727203999.28443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203999.29360: done with get_vars() 7491 1727203999.29382: done getting variables 7491 1727203999.29422: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727203999.29496: variable 'profile' from source: include params 7491 1727203999.29498: variable 'interface' from source: play vars 7491 1727203999.29543: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 
September 2024 14:53:19 -0400 (0:00:00.081) 0:00:41.219 ***** 7491 1727203999.29569: entering _queue_task() for managed-node3/assert 7491 1727203999.29810: worker is 1 (out of 1 available) 7491 1727203999.29828: exiting _queue_task() for managed-node3/assert 7491 1727203999.29841: done queuing things up, now waiting for results queue to drain 7491 1727203999.29844: waiting for pending results... 7491 1727203999.30030: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in veth0 7491 1727203999.30123: in run() - task 0affcd87-79f5-0a4a-ad01-0000000016d5 7491 1727203999.30144: variable 'ansible_search_path' from source: unknown 7491 1727203999.30147: variable 'ansible_search_path' from source: unknown 7491 1727203999.30178: calling self._execute() 7491 1727203999.30262: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.30271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.30280: variable 'omit' from source: magic vars 7491 1727203999.30568: variable 'ansible_distribution_major_version' from source: facts 7491 1727203999.30579: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203999.30586: variable 'omit' from source: magic vars 7491 1727203999.30620: variable 'omit' from source: magic vars 7491 1727203999.30696: variable 'profile' from source: include params 7491 1727203999.30700: variable 'interface' from source: play vars 7491 1727203999.30749: variable 'interface' from source: play vars 7491 1727203999.30762: variable 'omit' from source: magic vars 7491 1727203999.30801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203999.30832: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203999.30849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 
1727203999.30863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.30874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.30901: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203999.30905: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.30907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.30979: Set connection var ansible_timeout to 10 7491 1727203999.30985: Set connection var ansible_pipelining to False 7491 1727203999.30990: Set connection var ansible_shell_type to sh 7491 1727203999.30997: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203999.31004: Set connection var ansible_shell_executable to /bin/sh 7491 1727203999.31008: Set connection var ansible_connection to ssh 7491 1727203999.31029: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.31032: variable 'ansible_connection' from source: unknown 7491 1727203999.31035: variable 'ansible_module_compression' from source: unknown 7491 1727203999.31037: variable 'ansible_shell_type' from source: unknown 7491 1727203999.31039: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.31042: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.31044: variable 'ansible_pipelining' from source: unknown 7491 1727203999.31046: variable 'ansible_timeout' from source: unknown 7491 1727203999.31051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.31154: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203999.31163: variable 'omit' from source: magic vars 7491 1727203999.31172: starting attempt loop 7491 1727203999.31175: running the handler 7491 1727203999.31252: variable 'lsr_net_profile_fingerprint' from source: set_fact 7491 1727203999.31256: Evaluated conditional (lsr_net_profile_fingerprint): True 7491 1727203999.31262: handler run complete 7491 1727203999.31277: attempt loop complete, returning result 7491 1727203999.31280: _execute() done 7491 1727203999.31282: dumping result to json 7491 1727203999.31284: done dumping result, returning 7491 1727203999.31291: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in veth0 [0affcd87-79f5-0a4a-ad01-0000000016d5] 7491 1727203999.31296: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000016d5 7491 1727203999.31391: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000016d5 7491 1727203999.31395: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727203999.31448: no more pending results, returning what we have 7491 1727203999.31452: results queue empty 7491 1727203999.31453: checking for any_errors_fatal 7491 1727203999.31462: done checking for any_errors_fatal 7491 1727203999.31462: checking for max_fail_percentage 7491 1727203999.31465: done checking for max_fail_percentage 7491 1727203999.31466: checking to see if all hosts have failed and the running result is not ok 7491 1727203999.31467: done checking to see if all hosts have failed 7491 1727203999.31468: getting the remaining hosts for this loop 7491 1727203999.31470: done getting the remaining hosts for this loop 7491 1727203999.31473: getting the next task for host managed-node3 7491 1727203999.31481: done getting next task for host managed-node3 7491 
1727203999.31483: ^ task is: TASK: Show ipv4 routes 7491 1727203999.31485: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203999.31488: getting variables 7491 1727203999.31490: in VariableManager get_vars() 7491 1727203999.31545: Calling all_inventory to load vars for managed-node3 7491 1727203999.31548: Calling groups_inventory to load vars for managed-node3 7491 1727203999.31554: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203999.31566: Calling all_plugins_play to load vars for managed-node3 7491 1727203999.31569: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203999.31572: Calling groups_plugins_play to load vars for managed-node3 7491 1727203999.32480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203999.33407: done with get_vars() 7491 1727203999.33427: done getting variables 7491 1727203999.33473: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ipv4 routes] ******************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:114 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.039) 0:00:41.258 ***** 7491 1727203999.33498: entering _queue_task() for managed-node3/command 7491 1727203999.33737: worker is 1 (out of 1 available) 7491 1727203999.33753: exiting _queue_task() for managed-node3/command 7491 
1727203999.33768: done queuing things up, now waiting for results queue to drain 7491 1727203999.33770: waiting for pending results... 7491 1727203999.33965: running TaskExecutor() for managed-node3/TASK: Show ipv4 routes 7491 1727203999.34038: in run() - task 0affcd87-79f5-0a4a-ad01-0000000000ff 7491 1727203999.34049: variable 'ansible_search_path' from source: unknown 7491 1727203999.34081: calling self._execute() 7491 1727203999.34167: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.34172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.34180: variable 'omit' from source: magic vars 7491 1727203999.34477: variable 'ansible_distribution_major_version' from source: facts 7491 1727203999.34490: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203999.34495: variable 'omit' from source: magic vars 7491 1727203999.34514: variable 'omit' from source: magic vars 7491 1727203999.34544: variable 'omit' from source: magic vars 7491 1727203999.34585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203999.34613: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203999.34633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203999.34647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.34656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.34689: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203999.34693: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.34695: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727203999.34766: Set connection var ansible_timeout to 10 7491 1727203999.34772: Set connection var ansible_pipelining to False 7491 1727203999.34778: Set connection var ansible_shell_type to sh 7491 1727203999.34784: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203999.34791: Set connection var ansible_shell_executable to /bin/sh 7491 1727203999.34796: Set connection var ansible_connection to ssh 7491 1727203999.34815: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.34821: variable 'ansible_connection' from source: unknown 7491 1727203999.34824: variable 'ansible_module_compression' from source: unknown 7491 1727203999.34827: variable 'ansible_shell_type' from source: unknown 7491 1727203999.34829: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.34832: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.34834: variable 'ansible_pipelining' from source: unknown 7491 1727203999.34836: variable 'ansible_timeout' from source: unknown 7491 1727203999.34838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.34941: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203999.34950: variable 'omit' from source: magic vars 7491 1727203999.34955: starting attempt loop 7491 1727203999.34958: running the handler 7491 1727203999.34974: _low_level_execute_command(): starting 7491 1727203999.34981: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203999.35528: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.35582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.35633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203999.35644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203999.35703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203999.37302: stdout chunk (state=3): >>>/root <<< 7491 1727203999.37406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203999.37469: stderr chunk (state=3): >>><<< 7491 1727203999.37473: stdout chunk (state=3): >>><<< 7491 1727203999.37499: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203999.37511: _low_level_execute_command(): starting 7491 1727203999.37518: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203999.374997-9368-111807929115684 `" && echo ansible-tmp-1727203999.374997-9368-111807929115684="` echo /root/.ansible/tmp/ansible-tmp-1727203999.374997-9368-111807929115684 `" ) && sleep 0' 7491 1727203999.37991: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203999.37999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.38028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203999.38041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.38054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.38107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203999.38119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203999.38173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203999.39973: stdout chunk (state=3): >>>ansible-tmp-1727203999.374997-9368-111807929115684=/root/.ansible/tmp/ansible-tmp-1727203999.374997-9368-111807929115684 <<< 7491 1727203999.40133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203999.40136: stdout chunk (state=3): >>><<< 7491 1727203999.40143: stderr chunk (state=3): >>><<< 7491 1727203999.40159: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203999.374997-9368-111807929115684=/root/.ansible/tmp/ansible-tmp-1727203999.374997-9368-111807929115684 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203999.40190: variable 'ansible_module_compression' from source: unknown 7491 1727203999.40236: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203999.40271: variable 'ansible_facts' from source: unknown 7491 1727203999.40323: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203999.374997-9368-111807929115684/AnsiballZ_command.py 7491 1727203999.40439: Sending initial data 7491 1727203999.40442: Sent initial data (153 bytes) 7491 1727203999.41136: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203999.41142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.41182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203999.41195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.41247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203999.41268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203999.41307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203999.42982: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 7491 1727203999.42995: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203999.43024: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203999.43063: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpxieixpw8 /root/.ansible/tmp/ansible-tmp-1727203999.374997-9368-111807929115684/AnsiballZ_command.py <<< 7491 1727203999.43099: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203999.43888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203999.44007: stderr chunk (state=3): >>><<< 7491 1727203999.44011: stdout chunk (state=3): >>><<< 7491 1727203999.44032: done transferring module to remote 7491 1727203999.44040: _low_level_execute_command(): starting 7491 1727203999.44045: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727203999.374997-9368-111807929115684/ /root/.ansible/tmp/ansible-tmp-1727203999.374997-9368-111807929115684/AnsiballZ_command.py && sleep 0' 7491 1727203999.44515: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203999.44524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.44553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.44567: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203999.44578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.44631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203999.44639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203999.44691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203999.46349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203999.46403: stderr chunk (state=3): >>><<< 7491 1727203999.46406: stdout chunk (state=3): >>><<< 7491 1727203999.46424: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203999.46432: _low_level_execute_command(): starting 7491 1727203999.46436: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203999.374997-9368-111807929115684/AnsiballZ_command.py && sleep 0' 7491 1727203999.46890: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203999.46896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.46931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203999.46943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203999.46953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.47003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203999.47011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203999.47077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203999.60400: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-24 14:53:19.599667", "end": "2024-09-24 14:53:19.603078", "delta": "0:00:00.003411", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727203999.61637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727203999.61642: stdout chunk (state=3): >>><<< 7491 1727203999.61649: stderr chunk (state=3): >>><<< 7491 1727203999.61677: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-24 14:53:19.599667", "end": "2024-09-24 14:53:19.603078", "delta": "0:00:00.003411", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.15.87 closed. 7491 1727203999.61726: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203999.374997-9368-111807929115684/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727203999.61734: _low_level_execute_command(): starting 7491 1727203999.61740: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203999.374997-9368-111807929115684/ > /dev/null 2>&1 && sleep 0' 7491 1727203999.63246: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203999.63253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.63297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.63301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203999.63303: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.63376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203999.63390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203999.63465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203999.65329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203999.65333: stdout chunk (state=3): >>><<< 7491 1727203999.65336: stderr chunk (state=3): >>><<< 7491 1727203999.65379: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 7491 1727203999.65387: handler run complete 7491 1727203999.65575: Evaluated conditional (False): False 7491 1727203999.65578: attempt loop complete, returning result 7491 1727203999.65580: _execute() done 7491 1727203999.65582: dumping result to json 7491 1727203999.65584: done dumping result, returning 7491 1727203999.65586: done running TaskExecutor() for managed-node3/TASK: Show ipv4 routes [0affcd87-79f5-0a4a-ad01-0000000000ff] 7491 1727203999.65588: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000ff 7491 1727203999.65670: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000000ff 7491 1727203999.65674: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "route" ], "delta": "0:00:00.003411", "end": "2024-09-24 14:53:19.603078", "rc": 0, "start": "2024-09-24 14:53:19.599667" } STDOUT: default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 7491 1727203999.65756: no more pending results, returning what we have 7491 1727203999.65760: results queue empty 7491 1727203999.65762: checking for any_errors_fatal 7491 1727203999.65769: done checking for any_errors_fatal 7491 1727203999.65769: checking for max_fail_percentage 7491 1727203999.65771: done checking for max_fail_percentage 7491 1727203999.65772: checking to see if all hosts have failed and the running result is not ok 7491 1727203999.65774: done checking to see if all hosts have failed 7491 1727203999.65774: getting the remaining hosts for this loop 7491 1727203999.65776: done getting the remaining hosts for this loop 7491 1727203999.65780: getting the next task for host managed-node3 7491 1727203999.65787: done getting next task for host managed-node3 7491 1727203999.65789: ^ task is: TASK: Assert default ipv4 route is absent 7491 1727203999.65791: ^ state 
is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727203999.65795: getting variables 7491 1727203999.65796: in VariableManager get_vars() 7491 1727203999.65854: Calling all_inventory to load vars for managed-node3 7491 1727203999.65857: Calling groups_inventory to load vars for managed-node3 7491 1727203999.65860: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203999.65874: Calling all_plugins_play to load vars for managed-node3 7491 1727203999.65877: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203999.65881: Calling groups_plugins_play to load vars for managed-node3 7491 1727203999.69567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203999.72950: done with get_vars() 7491 1727203999.72987: done getting variables 7491 1727203999.73056: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv4 route is absent] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:118 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.395) 0:00:41.654 ***** 7491 1727203999.73089: entering _queue_task() for managed-node3/assert 7491 1727203999.73436: worker is 1 (out of 1 available) 7491 1727203999.73449: exiting _queue_task() for managed-node3/assert 7491 1727203999.73462: done queuing things up, now waiting for results queue to drain 7491 
1727203999.73465: waiting for pending results... 7491 1727203999.73782: running TaskExecutor() for managed-node3/TASK: Assert default ipv4 route is absent 7491 1727203999.73897: in run() - task 0affcd87-79f5-0a4a-ad01-000000000100 7491 1727203999.73929: variable 'ansible_search_path' from source: unknown 7491 1727203999.73994: calling self._execute() 7491 1727203999.74180: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.74208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.74241: variable 'omit' from source: magic vars 7491 1727203999.74697: variable 'ansible_distribution_major_version' from source: facts 7491 1727203999.74720: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203999.74734: variable 'omit' from source: magic vars 7491 1727203999.74763: variable 'omit' from source: magic vars 7491 1727203999.74839: variable 'omit' from source: magic vars 7491 1727203999.74929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203999.74974: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203999.75010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203999.75043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.75065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.75106: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203999.75124: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.75138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.75260: Set connection var ansible_timeout 
to 10 7491 1727203999.75277: Set connection var ansible_pipelining to False 7491 1727203999.75288: Set connection var ansible_shell_type to sh 7491 1727203999.75304: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203999.75316: Set connection var ansible_shell_executable to /bin/sh 7491 1727203999.75335: Set connection var ansible_connection to ssh 7491 1727203999.75372: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.75380: variable 'ansible_connection' from source: unknown 7491 1727203999.75386: variable 'ansible_module_compression' from source: unknown 7491 1727203999.75393: variable 'ansible_shell_type' from source: unknown 7491 1727203999.75398: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.75405: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.75414: variable 'ansible_pipelining' from source: unknown 7491 1727203999.75423: variable 'ansible_timeout' from source: unknown 7491 1727203999.75430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.75592: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203999.75612: variable 'omit' from source: magic vars 7491 1727203999.75628: starting attempt loop 7491 1727203999.75636: running the handler 7491 1727203999.76128: variable '__test_str' from source: task vars 7491 1727203999.76229: variable 'interface' from source: play vars 7491 1727203999.76233: variable 'ipv4_routes' from source: set_fact 7491 1727203999.76236: Evaluated conditional (__test_str not in ipv4_routes.stdout): True 7491 1727203999.76239: handler run complete 7491 1727203999.76256: attempt loop complete, returning result 7491 1727203999.76260: 
_execute() done 7491 1727203999.76263: dumping result to json 7491 1727203999.76266: done dumping result, returning 7491 1727203999.76274: done running TaskExecutor() for managed-node3/TASK: Assert default ipv4 route is absent [0affcd87-79f5-0a4a-ad01-000000000100] 7491 1727203999.76279: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000100 7491 1727203999.76385: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000100 7491 1727203999.76388: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727203999.76438: no more pending results, returning what we have 7491 1727203999.76441: results queue empty 7491 1727203999.76442: checking for any_errors_fatal 7491 1727203999.76452: done checking for any_errors_fatal 7491 1727203999.76453: checking for max_fail_percentage 7491 1727203999.76455: done checking for max_fail_percentage 7491 1727203999.76456: checking to see if all hosts have failed and the running result is not ok 7491 1727203999.76457: done checking to see if all hosts have failed 7491 1727203999.76457: getting the remaining hosts for this loop 7491 1727203999.76459: done getting the remaining hosts for this loop 7491 1727203999.76463: getting the next task for host managed-node3 7491 1727203999.76469: done getting next task for host managed-node3 7491 1727203999.76472: ^ task is: TASK: Get ipv6 routes 7491 1727203999.76473: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727203999.76477: getting variables 7491 1727203999.76478: in VariableManager get_vars() 7491 1727203999.76530: Calling all_inventory to load vars for managed-node3 7491 1727203999.76533: Calling groups_inventory to load vars for managed-node3 7491 1727203999.76535: Calling all_plugins_inventory to load vars for managed-node3 7491 1727203999.76545: Calling all_plugins_play to load vars for managed-node3 7491 1727203999.76548: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727203999.76550: Calling groups_plugins_play to load vars for managed-node3 7491 1727203999.78274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727203999.80597: done with get_vars() 7491 1727203999.80636: done getting variables 7491 1727203999.80706: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:123 Tuesday 24 September 2024 14:53:19 -0400 (0:00:00.076) 0:00:41.731 ***** 7491 1727203999.80739: entering _queue_task() for managed-node3/command 7491 1727203999.81570: worker is 1 (out of 1 available) 7491 1727203999.81589: exiting _queue_task() for managed-node3/command 7491 1727203999.81604: done queuing things up, now waiting for results queue to drain 7491 1727203999.81606: waiting for pending results... 
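The "Assert default ipv4 route is absent" task above passes because its conditional, `__test_str not in ipv4_routes.stdout`, is a plain substring check against the `ip route` output captured earlier in this log. A minimal Python sketch of that check follows; `route_output` is copied from the "Show ipv4 routes" result above, while `test_str` is an assumption (the actual value of `__test_str` never appears in the log):

```python
# Minimal sketch of the membership test behind "Assert default ipv4 route is absent".
# route_output: copied verbatim from the "Show ipv4 routes" STDOUT earlier in this log.
# test_str: hypothetical -- the real __test_str value is not shown in the log.
route_output = (
    "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100\n"
    "10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100\n"
    "203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101"
)

test_str = "default via 203.0.113.1"  # assumed: a default route via the test interface

# The task's conditional "__test_str not in ipv4_routes.stdout" is evaluated as
# a substring check; here it is True, matching "Evaluated conditional (...): True".
assertion_passed = test_str not in route_output
print(assertion_passed)  # -> True
```

The assert action runs entirely on the controller, which is why, unlike the surrounding command tasks, no `_low_level_execute_command()` calls appear between its "running the handler" and "handler run complete" lines.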
7491 1727203999.82693: running TaskExecutor() for managed-node3/TASK: Get ipv6 routes 7491 1727203999.82913: in run() - task 0affcd87-79f5-0a4a-ad01-000000000101 7491 1727203999.82940: variable 'ansible_search_path' from source: unknown 7491 1727203999.83103: calling self._execute() 7491 1727203999.83237: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.83324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.83342: variable 'omit' from source: magic vars 7491 1727203999.85577: variable 'ansible_distribution_major_version' from source: facts 7491 1727203999.85600: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727203999.85614: variable 'omit' from source: magic vars 7491 1727203999.85650: variable 'omit' from source: magic vars 7491 1727203999.85702: variable 'omit' from source: magic vars 7491 1727203999.86075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727203999.86126: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727203999.86153: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727203999.86176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.86267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727203999.86302: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727203999.86309: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.86483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.86659: Set connection var ansible_timeout to 10 7491 1727203999.86673: Set connection var ansible_pipelining 
to False 7491 1727203999.86682: Set connection var ansible_shell_type to sh 7491 1727203999.86792: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727203999.86825: Set connection var ansible_shell_executable to /bin/sh 7491 1727203999.86837: Set connection var ansible_connection to ssh 7491 1727203999.86868: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.86878: variable 'ansible_connection' from source: unknown 7491 1727203999.86893: variable 'ansible_module_compression' from source: unknown 7491 1727203999.86901: variable 'ansible_shell_type' from source: unknown 7491 1727203999.86909: variable 'ansible_shell_executable' from source: unknown 7491 1727203999.86916: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727203999.86929: variable 'ansible_pipelining' from source: unknown 7491 1727203999.86936: variable 'ansible_timeout' from source: unknown 7491 1727203999.86944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727203999.87099: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727203999.87125: variable 'omit' from source: magic vars 7491 1727203999.87136: starting attempt loop 7491 1727203999.87143: running the handler 7491 1727203999.87163: _low_level_execute_command(): starting 7491 1727203999.87180: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727203999.88693: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727203999.88705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203999.88716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 
1727203999.88733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.88773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203999.88781: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203999.88791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.88806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203999.88815: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203999.88822: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203999.88832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203999.88844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203999.88858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.88868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203999.88876: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203999.88887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.88980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203999.88996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203999.89004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203999.89079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203999.90680: stdout chunk (state=3): >>>/root <<< 7491 1727203999.90871: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 7491 1727203999.90875: stdout chunk (state=3): >>><<< 7491 1727203999.90877: stderr chunk (state=3): >>><<< 7491 1727203999.90971: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203999.90975: _low_level_execute_command(): starting 7491 1727203999.90978: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203999.9090805-9387-192575377388032 `" && echo ansible-tmp-1727203999.9090805-9387-192575377388032="` echo /root/.ansible/tmp/ansible-tmp-1727203999.9090805-9387-192575377388032 `" ) && sleep 0' 7491 1727203999.91763: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 
1727203999.91768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.91804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.91808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203999.91811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.91852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203999.91863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203999.91915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203999.93741: stdout chunk (state=3): >>>ansible-tmp-1727203999.9090805-9387-192575377388032=/root/.ansible/tmp/ansible-tmp-1727203999.9090805-9387-192575377388032 <<< 7491 1727203999.93888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203999.93959: stderr chunk (state=3): >>><<< 7491 1727203999.93962: stdout chunk (state=3): >>><<< 7491 1727203999.93986: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203999.9090805-9387-192575377388032=/root/.ansible/tmp/ansible-tmp-1727203999.9090805-9387-192575377388032 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727203999.94012: variable 'ansible_module_compression' from source: unknown 7491 1727203999.94057: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727203999.94094: variable 'ansible_facts' from source: unknown 7491 1727203999.94146: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203999.9090805-9387-192575377388032/AnsiballZ_command.py 7491 1727203999.94254: Sending initial data 7491 1727203999.94258: Sent initial data (154 bytes) 7491 1727203999.94937: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203999.94943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.94977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727203999.94984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.94990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727203999.94995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727203999.95000: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727203999.95005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203999.95014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727203999.95022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.95027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203999.95034: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727203999.95039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.95095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203999.95109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203999.95116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203999.95179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727203999.96872: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 
1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727203999.96905: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727203999.96947: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpbwr260zb /root/.ansible/tmp/ansible-tmp-1727203999.9090805-9387-192575377388032/AnsiballZ_command.py <<< 7491 1727203999.96988: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727203999.97901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727203999.98011: stderr chunk (state=3): >>><<< 7491 1727203999.98014: stdout chunk (state=3): >>><<< 7491 1727203999.98032: done transferring module to remote 7491 1727203999.98041: _low_level_execute_command(): starting 7491 1727203999.98046: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203999.9090805-9387-192575377388032/ /root/.ansible/tmp/ansible-tmp-1727203999.9090805-9387-192575377388032/AnsiballZ_command.py && sleep 0' 7491 1727203999.98493: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203999.98499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727203999.98528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727203999.98537: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727203999.98545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 
1727203999.98563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727203999.98572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727203999.98580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727203999.98627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727203999.98649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727203999.98652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727203999.98697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204000.00506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204000.00509: stdout chunk (state=3): >>><<< 7491 1727204000.00522: stderr chunk (state=3): >>><<< 7491 1727204000.00545: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204000.00548: _low_level_execute_command(): starting 7491 1727204000.00553: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727203999.9090805-9387-192575377388032/AnsiballZ_command.py && sleep 0' 7491 1727204000.01175: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.01180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.01288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204000.01292: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204000.01294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.01297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204000.01299: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204000.01301: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204000.01303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.01305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204000.01452: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.01456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204000.01458: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204000.01460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.01467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204000.01497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204000.01503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204000.01518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204000.15015: stdout chunk (state=3): >>> {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 14:53:20.145919", "end": "2024-09-24 14:53:20.149316", "delta": "0:00:00.003397", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727204000.16199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727204000.16256: stderr chunk (state=3): >>><<< 7491 1727204000.16260: stdout chunk (state=3): >>><<< 7491 1727204000.16282: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 14:53:20.145919", "end": "2024-09-24 14:53:20.149316", "delta": "0:00:00.003397", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727204000.16314: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203999.9090805-9387-192575377388032/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727204000.16325: _low_level_execute_command(): starting 7491 1727204000.16328: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203999.9090805-9387-192575377388032/ > /dev/null 2>&1 && sleep 0' 7491 1727204000.16801: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204000.16805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.16840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.16843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.16901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204000.16905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204000.16907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204000.16954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204000.18703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204000.18759: stderr chunk (state=3): >>><<< 7491 1727204000.18763: stdout chunk (state=3): >>><<< 7491 1727204000.18782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204000.18788: handler run complete 7491 1727204000.18809: Evaluated conditional (False): False 7491 1727204000.18818: attempt loop complete, returning result 7491 1727204000.18823: _execute() done 7491 1727204000.18825: dumping result to json 7491 1727204000.18831: done dumping result, returning 7491 1727204000.18838: done running TaskExecutor() for managed-node3/TASK: Get ipv6 routes [0affcd87-79f5-0a4a-ad01-000000000101] 7491 1727204000.18844: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000101 7491 1727204000.18946: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000101 7491 1727204000.18949: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003397", "end": "2024-09-24 14:53:20.149316", "rc": 0, "start": "2024-09-24 14:53:20.145919" } STDOUT: ::1 dev lo proto kernel metric 256 pref medium 2001:db8::/64 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium fe80::/64 dev peerveth0 proto kernel metric 256 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium 7491 1727204000.19022: no more pending results, returning what we have 7491 1727204000.19025: results queue empty 7491 1727204000.19026: checking for any_errors_fatal 7491 1727204000.19034: done checking for any_errors_fatal 7491 1727204000.19035: checking for max_fail_percentage 7491 1727204000.19037: done checking for max_fail_percentage 7491 1727204000.19038: checking to see if all hosts have failed and the running result is not ok 7491 1727204000.19039: done checking to see if all hosts have failed 7491 1727204000.19040: getting the remaining hosts for this loop 7491 1727204000.19042: done getting the remaining hosts for this loop 7491 1727204000.19045: getting the next task for host managed-node3 7491 1727204000.19050: done getting 
next task for host managed-node3 7491 1727204000.19052: ^ task is: TASK: Assert default ipv6 route is absent 7491 1727204000.19054: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727204000.19057: getting variables 7491 1727204000.19059: in VariableManager get_vars() 7491 1727204000.19111: Calling all_inventory to load vars for managed-node3 7491 1727204000.19114: Calling groups_inventory to load vars for managed-node3 7491 1727204000.19116: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204000.19126: Calling all_plugins_play to load vars for managed-node3 7491 1727204000.19128: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204000.19131: Calling groups_plugins_play to load vars for managed-node3 7491 1727204000.19953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204000.20881: done with get_vars() 7491 1727204000.20897: done getting variables 7491 1727204000.20943: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is absent] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:127 Tuesday 24 September 2024 14:53:20 -0400 (0:00:00.402) 0:00:42.133 ***** 7491 1727204000.20968: entering _queue_task() for managed-node3/assert 7491 1727204000.21183: worker is 1 (out of 1 available) 7491 1727204000.21198: exiting 
_queue_task() for managed-node3/assert 7491 1727204000.21211: done queuing things up, now waiting for results queue to drain 7491 1727204000.21212: waiting for pending results... 7491 1727204000.21398: running TaskExecutor() for managed-node3/TASK: Assert default ipv6 route is absent 7491 1727204000.21452: in run() - task 0affcd87-79f5-0a4a-ad01-000000000102 7491 1727204000.21463: variable 'ansible_search_path' from source: unknown 7491 1727204000.21496: calling self._execute() 7491 1727204000.21587: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.21591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.21600: variable 'omit' from source: magic vars 7491 1727204000.21895: variable 'ansible_distribution_major_version' from source: facts 7491 1727204000.21905: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204000.21990: variable 'network_provider' from source: set_fact 7491 1727204000.21997: Evaluated conditional (network_provider == "nm"): True 7491 1727204000.22004: variable 'omit' from source: magic vars 7491 1727204000.22021: variable 'omit' from source: magic vars 7491 1727204000.22055: variable 'omit' from source: magic vars 7491 1727204000.22089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204000.22119: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204000.22140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204000.22159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204000.22170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204000.22195: variable 'inventory_hostname' from 
source: host vars for 'managed-node3' 7491 1727204000.22198: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.22200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.22273: Set connection var ansible_timeout to 10 7491 1727204000.22278: Set connection var ansible_pipelining to False 7491 1727204000.22283: Set connection var ansible_shell_type to sh 7491 1727204000.22289: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204000.22295: Set connection var ansible_shell_executable to /bin/sh 7491 1727204000.22300: Set connection var ansible_connection to ssh 7491 1727204000.22321: variable 'ansible_shell_executable' from source: unknown 7491 1727204000.22324: variable 'ansible_connection' from source: unknown 7491 1727204000.22327: variable 'ansible_module_compression' from source: unknown 7491 1727204000.22329: variable 'ansible_shell_type' from source: unknown 7491 1727204000.22331: variable 'ansible_shell_executable' from source: unknown 7491 1727204000.22333: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.22335: variable 'ansible_pipelining' from source: unknown 7491 1727204000.22337: variable 'ansible_timeout' from source: unknown 7491 1727204000.22339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.22446: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204000.22456: variable 'omit' from source: magic vars 7491 1727204000.22461: starting attempt loop 7491 1727204000.22465: running the handler 7491 1727204000.22569: variable '__test_str' from source: task vars 7491 1727204000.22622: variable 'interface' from source: play vars 
7491 1727204000.22628: variable 'ipv6_route' from source: set_fact 7491 1727204000.22639: Evaluated conditional (__test_str not in ipv6_route.stdout): True 7491 1727204000.22644: handler run complete 7491 1727204000.22658: attempt loop complete, returning result 7491 1727204000.22661: _execute() done 7491 1727204000.22666: dumping result to json 7491 1727204000.22669: done dumping result, returning 7491 1727204000.22674: done running TaskExecutor() for managed-node3/TASK: Assert default ipv6 route is absent [0affcd87-79f5-0a4a-ad01-000000000102] 7491 1727204000.22679: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000102 7491 1727204000.22776: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000102 7491 1727204000.22779: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 7491 1727204000.22833: no more pending results, returning what we have 7491 1727204000.22837: results queue empty 7491 1727204000.22838: checking for any_errors_fatal 7491 1727204000.22847: done checking for any_errors_fatal 7491 1727204000.22847: checking for max_fail_percentage 7491 1727204000.22849: done checking for max_fail_percentage 7491 1727204000.22850: checking to see if all hosts have failed and the running result is not ok 7491 1727204000.22851: done checking to see if all hosts have failed 7491 1727204000.22852: getting the remaining hosts for this loop 7491 1727204000.22854: done getting the remaining hosts for this loop 7491 1727204000.22857: getting the next task for host managed-node3 7491 1727204000.22862: done getting next task for host managed-node3 7491 1727204000.22866: ^ task is: TASK: TEARDOWN: remove profiles. 7491 1727204000.22869: ^ state is: HOST STATE: block=2, task=36, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7491 1727204000.22871: getting variables 7491 1727204000.22873: in VariableManager get_vars() 7491 1727204000.22929: Calling all_inventory to load vars for managed-node3 7491 1727204000.22932: Calling groups_inventory to load vars for managed-node3 7491 1727204000.22934: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204000.22944: Calling all_plugins_play to load vars for managed-node3 7491 1727204000.22947: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204000.22949: Calling groups_plugins_play to load vars for managed-node3 7491 1727204000.23859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204000.24783: done with get_vars() 7491 1727204000.24800: done getting variables 7491 1727204000.24842: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:133 Tuesday 24 September 2024 14:53:20 -0400 (0:00:00.038) 0:00:42.172 ***** 7491 1727204000.24868: entering _queue_task() for managed-node3/debug 7491 1727204000.25080: worker is 1 (out of 1 available) 7491 1727204000.25095: exiting _queue_task() for managed-node3/debug 7491 1727204000.25109: done queuing things up, now waiting for results queue to drain 7491 1727204000.25110: waiting for pending results... 7491 1727204000.25297: running TaskExecutor() for managed-node3/TASK: TEARDOWN: remove profiles. 
7491 1727204000.25360: in run() - task 0affcd87-79f5-0a4a-ad01-000000000103 7491 1727204000.25374: variable 'ansible_search_path' from source: unknown 7491 1727204000.25403: calling self._execute() 7491 1727204000.25488: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.25493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.25500: variable 'omit' from source: magic vars 7491 1727204000.25789: variable 'ansible_distribution_major_version' from source: facts 7491 1727204000.25798: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204000.25805: variable 'omit' from source: magic vars 7491 1727204000.25824: variable 'omit' from source: magic vars 7491 1727204000.25852: variable 'omit' from source: magic vars 7491 1727204000.25883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204000.25909: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204000.25930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204000.25943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204000.25952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204000.25980: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204000.25984: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.25986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.26055: Set connection var ansible_timeout to 10 7491 1727204000.26060: Set connection var ansible_pipelining to False 7491 1727204000.26068: Set connection var ansible_shell_type to sh 7491 
1727204000.26074: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204000.26080: Set connection var ansible_shell_executable to /bin/sh 7491 1727204000.26087: Set connection var ansible_connection to ssh 7491 1727204000.26106: variable 'ansible_shell_executable' from source: unknown 7491 1727204000.26109: variable 'ansible_connection' from source: unknown 7491 1727204000.26111: variable 'ansible_module_compression' from source: unknown 7491 1727204000.26114: variable 'ansible_shell_type' from source: unknown 7491 1727204000.26116: variable 'ansible_shell_executable' from source: unknown 7491 1727204000.26121: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.26123: variable 'ansible_pipelining' from source: unknown 7491 1727204000.26125: variable 'ansible_timeout' from source: unknown 7491 1727204000.26127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.26236: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204000.26245: variable 'omit' from source: magic vars 7491 1727204000.26250: starting attempt loop 7491 1727204000.26253: running the handler 7491 1727204000.26289: handler run complete 7491 1727204000.26307: attempt loop complete, returning result 7491 1727204000.26310: _execute() done 7491 1727204000.26313: dumping result to json 7491 1727204000.26316: done dumping result, returning 7491 1727204000.26324: done running TaskExecutor() for managed-node3/TASK: TEARDOWN: remove profiles. 
[0affcd87-79f5-0a4a-ad01-000000000103] 7491 1727204000.26330: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000103 7491 1727204000.26415: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000103 7491 1727204000.26418: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: ################################################## 7491 1727204000.26470: no more pending results, returning what we have 7491 1727204000.26474: results queue empty 7491 1727204000.26475: checking for any_errors_fatal 7491 1727204000.26481: done checking for any_errors_fatal 7491 1727204000.26482: checking for max_fail_percentage 7491 1727204000.26484: done checking for max_fail_percentage 7491 1727204000.26484: checking to see if all hosts have failed and the running result is not ok 7491 1727204000.26485: done checking to see if all hosts have failed 7491 1727204000.26486: getting the remaining hosts for this loop 7491 1727204000.26488: done getting the remaining hosts for this loop 7491 1727204000.26491: getting the next task for host managed-node3 7491 1727204000.26497: done getting next task for host managed-node3 7491 1727204000.26503: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7491 1727204000.26506: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204000.26528: getting variables 7491 1727204000.26530: in VariableManager get_vars() 7491 1727204000.26573: Calling all_inventory to load vars for managed-node3 7491 1727204000.26576: Calling groups_inventory to load vars for managed-node3 7491 1727204000.26578: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204000.26586: Calling all_plugins_play to load vars for managed-node3 7491 1727204000.26588: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204000.26591: Calling groups_plugins_play to load vars for managed-node3 7491 1727204000.27384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204000.28424: done with get_vars() 7491 1727204000.28440: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:53:20 -0400 (0:00:00.036) 0:00:42.209 ***** 7491 1727204000.28513: entering _queue_task() for managed-node3/include_tasks 7491 1727204000.28740: worker is 1 (out of 1 available) 7491 1727204000.28754: exiting _queue_task() for managed-node3/include_tasks 7491 1727204000.28769: done queuing things up, now waiting for results queue to drain 7491 1727204000.28770: waiting for pending results... 
7491 1727204000.28956: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7491 1727204000.29070: in run() - task 0affcd87-79f5-0a4a-ad01-00000000010b 7491 1727204000.29081: variable 'ansible_search_path' from source: unknown 7491 1727204000.29085: variable 'ansible_search_path' from source: unknown 7491 1727204000.29115: calling self._execute() 7491 1727204000.29201: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.29204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.29214: variable 'omit' from source: magic vars 7491 1727204000.29500: variable 'ansible_distribution_major_version' from source: facts 7491 1727204000.29511: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204000.29516: _execute() done 7491 1727204000.29519: dumping result to json 7491 1727204000.29525: done dumping result, returning 7491 1727204000.29532: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-0a4a-ad01-00000000010b] 7491 1727204000.29539: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000010b 7491 1727204000.29637: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000010b 7491 1727204000.29640: WORKER PROCESS EXITING 7491 1727204000.29695: no more pending results, returning what we have 7491 1727204000.29699: in VariableManager get_vars() 7491 1727204000.29760: Calling all_inventory to load vars for managed-node3 7491 1727204000.29763: Calling groups_inventory to load vars for managed-node3 7491 1727204000.29767: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204000.29776: Calling all_plugins_play to load vars for managed-node3 7491 1727204000.29779: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204000.29781: Calling groups_plugins_play to load vars for 
managed-node3 7491 1727204000.30587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204000.31511: done with get_vars() 7491 1727204000.31528: variable 'ansible_search_path' from source: unknown 7491 1727204000.31529: variable 'ansible_search_path' from source: unknown 7491 1727204000.31555: we have included files to process 7491 1727204000.31556: generating all_blocks data 7491 1727204000.31558: done generating all_blocks data 7491 1727204000.31562: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7491 1727204000.31562: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7491 1727204000.31566: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7491 1727204000.31961: done processing included file 7491 1727204000.31962: iterating over new_blocks loaded from include file 7491 1727204000.31965: in VariableManager get_vars() 7491 1727204000.31985: done with get_vars() 7491 1727204000.31986: filtering new block on tags 7491 1727204000.31999: done filtering new block on tags 7491 1727204000.32000: in VariableManager get_vars() 7491 1727204000.32019: done with get_vars() 7491 1727204000.32020: filtering new block on tags 7491 1727204000.32036: done filtering new block on tags 7491 1727204000.32038: in VariableManager get_vars() 7491 1727204000.32056: done with get_vars() 7491 1727204000.32057: filtering new block on tags 7491 1727204000.32069: done filtering new block on tags 7491 1727204000.32071: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 7491 1727204000.32075: extending task lists for all hosts with included blocks 7491 1727204000.32537: done 
extending task lists 7491 1727204000.32538: done processing included files 7491 1727204000.32539: results queue empty 7491 1727204000.32539: checking for any_errors_fatal 7491 1727204000.32542: done checking for any_errors_fatal 7491 1727204000.32542: checking for max_fail_percentage 7491 1727204000.32543: done checking for max_fail_percentage 7491 1727204000.32544: checking to see if all hosts have failed and the running result is not ok 7491 1727204000.32544: done checking to see if all hosts have failed 7491 1727204000.32545: getting the remaining hosts for this loop 7491 1727204000.32546: done getting the remaining hosts for this loop 7491 1727204000.32547: getting the next task for host managed-node3 7491 1727204000.32550: done getting next task for host managed-node3 7491 1727204000.32551: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7491 1727204000.32553: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204000.32561: getting variables 7491 1727204000.32562: in VariableManager get_vars() 7491 1727204000.32577: Calling all_inventory to load vars for managed-node3 7491 1727204000.32579: Calling groups_inventory to load vars for managed-node3 7491 1727204000.32580: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204000.32584: Calling all_plugins_play to load vars for managed-node3 7491 1727204000.32585: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204000.32587: Calling groups_plugins_play to load vars for managed-node3 7491 1727204000.33339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204000.34248: done with get_vars() 7491 1727204000.34263: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:53:20 -0400 (0:00:00.057) 0:00:42.267 ***** 7491 1727204000.34316: entering _queue_task() for managed-node3/setup 7491 1727204000.34545: worker is 1 (out of 1 available) 7491 1727204000.34560: exiting _queue_task() for managed-node3/setup 7491 1727204000.34575: done queuing things up, now waiting for results queue to drain 7491 1727204000.34577: waiting for pending results... 
7491 1727204000.34762: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7491 1727204000.34869: in run() - task 0affcd87-79f5-0a4a-ad01-0000000019b6 7491 1727204000.34882: variable 'ansible_search_path' from source: unknown 7491 1727204000.34886: variable 'ansible_search_path' from source: unknown 7491 1727204000.34913: calling self._execute() 7491 1727204000.34991: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.34995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.35001: variable 'omit' from source: magic vars 7491 1727204000.35273: variable 'ansible_distribution_major_version' from source: facts 7491 1727204000.35284: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204000.35432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727204000.37031: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727204000.37086: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727204000.37114: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727204000.37142: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727204000.37166: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727204000.37226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204000.37248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204000.37268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204000.37297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204000.37308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204000.37350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204000.37368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204000.37387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204000.37413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204000.37426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204000.37539: variable '__network_required_facts' from source: role '' defaults 
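The gating expression evaluated in the next record, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, skips the fact-gathering `setup` task when every required fact is already cached. A minimal plain-Python sketch of that check (the fact names here are illustrative, not the role's actual list):

```python
# Illustrative fact names -- the role's real __network_required_facts list
# is not shown in this log.
__network_required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "9",
    "os_family": "RedHat",
}

# Ansible's difference filter keeps items of the first list that are
# absent from the second.
missing = [f for f in __network_required_facts if f not in ansible_facts]
gather_needed = len(missing) > 0
print(gather_needed)  # False -> the setup task is skipped, as the log shows
```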
7491 1727204000.37547: variable 'ansible_facts' from source: unknown 7491 1727204000.38036: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7491 1727204000.38040: when evaluation is False, skipping this task 7491 1727204000.38043: _execute() done 7491 1727204000.38047: dumping result to json 7491 1727204000.38050: done dumping result, returning 7491 1727204000.38054: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-0a4a-ad01-0000000019b6] 7491 1727204000.38061: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000019b6 7491 1727204000.38149: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000019b6 7491 1727204000.38151: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727204000.38205: no more pending results, returning what we have 7491 1727204000.38208: results queue empty 7491 1727204000.38209: checking for any_errors_fatal 7491 1727204000.38211: done checking for any_errors_fatal 7491 1727204000.38211: checking for max_fail_percentage 7491 1727204000.38213: done checking for max_fail_percentage 7491 1727204000.38214: checking to see if all hosts have failed and the running result is not ok 7491 1727204000.38215: done checking to see if all hosts have failed 7491 1727204000.38216: getting the remaining hosts for this loop 7491 1727204000.38217: done getting the remaining hosts for this loop 7491 1727204000.38221: getting the next task for host managed-node3 7491 1727204000.38230: done getting next task for host managed-node3 7491 1727204000.38234: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7491 1727204000.38238: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727204000.38267: getting variables 7491 1727204000.38269: in VariableManager get_vars() 7491 1727204000.38318: Calling all_inventory to load vars for managed-node3 7491 1727204000.38321: Calling groups_inventory to load vars for managed-node3 7491 1727204000.38323: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204000.38332: Calling all_plugins_play to load vars for managed-node3 7491 1727204000.38335: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204000.38337: Calling groups_plugins_play to load vars for managed-node3 7491 1727204000.39353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204000.40292: done with get_vars() 7491 1727204000.40314: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:53:20 -0400 (0:00:00.060) 0:00:42.327 ***** 7491 1727204000.40400: entering _queue_task() for managed-node3/stat 7491 1727204000.40636: worker is 1 (out of 1 available) 7491 1727204000.40651: exiting _queue_task() 
for managed-node3/stat 7491 1727204000.40667: done queuing things up, now waiting for results queue to drain 7491 1727204000.40669: waiting for pending results... 7491 1727204000.40870: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7491 1727204000.41156: in run() - task 0affcd87-79f5-0a4a-ad01-0000000019b8 7491 1727204000.41179: variable 'ansible_search_path' from source: unknown 7491 1727204000.41186: variable 'ansible_search_path' from source: unknown 7491 1727204000.41237: calling self._execute() 7491 1727204000.41347: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.41357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.41373: variable 'omit' from source: magic vars 7491 1727204000.41779: variable 'ansible_distribution_major_version' from source: facts 7491 1727204000.41796: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204000.41985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727204000.42284: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727204000.42341: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727204000.42380: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727204000.42431: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727204000.42532: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727204000.42562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727204000.42595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204000.42636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727204000.42742: variable '__network_is_ostree' from source: set_fact 7491 1727204000.42753: Evaluated conditional (not __network_is_ostree is defined): False 7491 1727204000.42761: when evaluation is False, skipping this task 7491 1727204000.42770: _execute() done 7491 1727204000.42776: dumping result to json 7491 1727204000.42783: done dumping result, returning 7491 1727204000.42794: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-0a4a-ad01-0000000019b8] 7491 1727204000.42806: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000019b8 7491 1727204000.42925: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000019b8 7491 1727204000.42939: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7491 1727204000.42995: no more pending results, returning what we have 7491 1727204000.42999: results queue empty 7491 1727204000.43000: checking for any_errors_fatal 7491 1727204000.43009: done checking for any_errors_fatal 7491 1727204000.43009: checking for max_fail_percentage 7491 1727204000.43011: done checking for max_fail_percentage 7491 1727204000.43012: checking to see if all hosts have failed and the running result is not ok 7491 1727204000.43014: done checking to see if all hosts have failed 7491 
1727204000.43015: getting the remaining hosts for this loop 7491 1727204000.43017: done getting the remaining hosts for this loop 7491 1727204000.43020: getting the next task for host managed-node3 7491 1727204000.43027: done getting next task for host managed-node3 7491 1727204000.43032: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7491 1727204000.43038: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204000.43063: getting variables 7491 1727204000.43067: in VariableManager get_vars() 7491 1727204000.43121: Calling all_inventory to load vars for managed-node3 7491 1727204000.43125: Calling groups_inventory to load vars for managed-node3 7491 1727204000.43127: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204000.43138: Calling all_plugins_play to load vars for managed-node3 7491 1727204000.43141: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204000.43144: Calling groups_plugins_play to load vars for managed-node3 7491 1727204000.44936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204000.45874: done with get_vars() 7491 1727204000.45897: done getting variables 7491 1727204000.45944: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:53:20 -0400 (0:00:00.055) 0:00:42.383 ***** 7491 1727204000.45975: entering _queue_task() for managed-node3/set_fact 7491 1727204000.46215: worker is 1 (out of 1 available) 7491 1727204000.46231: exiting _queue_task() for managed-node3/set_fact 7491 1727204000.46245: done queuing things up, now waiting for results queue to drain 7491 1727204000.46246: waiting for pending results... 
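Both the ostree `stat` check and the `set_fact` task that follows are guarded by `not __network_is_ostree is defined`, so they run only on the first pass before the fact exists. A rough plain-Python approximation of that Jinja guard (a dict-membership stand-in for the real `is defined` test):

```python
# Fact already set by an earlier pass of the role, as in this log.
host_facts = {"__network_is_ostree": False}

def is_defined(facts, name):
    """Approximate Jinja's 'is defined' test with a dict lookup."""
    return name in facts

run_task = not is_defined(host_facts, "__network_is_ostree")
print(run_task)  # False -> Ansible reports "skipping", matching the log
```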
7491 1727204000.46459: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7491 1727204000.46678: in run() - task 0affcd87-79f5-0a4a-ad01-0000000019b9 7491 1727204000.46699: variable 'ansible_search_path' from source: unknown 7491 1727204000.46707: variable 'ansible_search_path' from source: unknown 7491 1727204000.46745: calling self._execute() 7491 1727204000.46850: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.46862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.46881: variable 'omit' from source: magic vars 7491 1727204000.47256: variable 'ansible_distribution_major_version' from source: facts 7491 1727204000.47276: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204000.47450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727204000.47742: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727204000.47794: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727204000.47832: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727204000.47880: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727204000.47970: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727204000.48002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727204000.48034: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204000.48072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727204000.48180: variable '__network_is_ostree' from source: set_fact 7491 1727204000.48191: Evaluated conditional (not __network_is_ostree is defined): False 7491 1727204000.48198: when evaluation is False, skipping this task 7491 1727204000.48204: _execute() done 7491 1727204000.48210: dumping result to json 7491 1727204000.48217: done dumping result, returning 7491 1727204000.48228: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-0a4a-ad01-0000000019b9] 7491 1727204000.48238: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000019b9 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7491 1727204000.48382: no more pending results, returning what we have 7491 1727204000.48387: results queue empty 7491 1727204000.48389: checking for any_errors_fatal 7491 1727204000.48395: done checking for any_errors_fatal 7491 1727204000.48395: checking for max_fail_percentage 7491 1727204000.48398: done checking for max_fail_percentage 7491 1727204000.48399: checking to see if all hosts have failed and the running result is not ok 7491 1727204000.48400: done checking to see if all hosts have failed 7491 1727204000.48401: getting the remaining hosts for this loop 7491 1727204000.48403: done getting the remaining hosts for this loop 7491 1727204000.48406: getting the next task for host managed-node3 7491 1727204000.48415: done getting next task for host managed-node3 7491 1727204000.48421: ^ 
task is: TASK: fedora.linux_system_roles.network : Check which services are running 7491 1727204000.48425: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727204000.48452: getting variables 7491 1727204000.48454: in VariableManager get_vars() 7491 1727204000.48517: Calling all_inventory to load vars for managed-node3 7491 1727204000.48522: Calling groups_inventory to load vars for managed-node3 7491 1727204000.48524: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204000.48535: Calling all_plugins_play to load vars for managed-node3 7491 1727204000.48538: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204000.48541: Calling groups_plugins_play to load vars for managed-node3 7491 1727204000.49061: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000019b9 7491 1727204000.49065: WORKER PROCESS EXITING 7491 1727204000.49994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204000.52235: done with get_vars() 7491 1727204000.52304: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task 
path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:53:20 -0400 (0:00:00.064) 0:00:42.448 ***** 7491 1727204000.52415: entering _queue_task() for managed-node3/service_facts 7491 1727204000.52762: worker is 1 (out of 1 available) 7491 1727204000.52776: exiting _queue_task() for managed-node3/service_facts 7491 1727204000.52789: done queuing things up, now waiting for results queue to drain 7491 1727204000.52790: waiting for pending results... 7491 1727204000.53097: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 7491 1727204000.53294: in run() - task 0affcd87-79f5-0a4a-ad01-0000000019bb 7491 1727204000.53314: variable 'ansible_search_path' from source: unknown 7491 1727204000.53329: variable 'ansible_search_path' from source: unknown 7491 1727204000.53373: calling self._execute() 7491 1727204000.53485: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.53495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.53508: variable 'omit' from source: magic vars 7491 1727204000.53927: variable 'ansible_distribution_major_version' from source: facts 7491 1727204000.53944: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204000.53955: variable 'omit' from source: magic vars 7491 1727204000.54053: variable 'omit' from source: magic vars 7491 1727204000.54104: variable 'omit' from source: magic vars 7491 1727204000.54154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204000.54205: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204000.54233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204000.54255: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204000.54272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204000.54313: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204000.54324: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.54332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.54449: Set connection var ansible_timeout to 10 7491 1727204000.54461: Set connection var ansible_pipelining to False 7491 1727204000.54472: Set connection var ansible_shell_type to sh 7491 1727204000.54482: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204000.54492: Set connection var ansible_shell_executable to /bin/sh 7491 1727204000.54500: Set connection var ansible_connection to ssh 7491 1727204000.54541: variable 'ansible_shell_executable' from source: unknown 7491 1727204000.54549: variable 'ansible_connection' from source: unknown 7491 1727204000.54555: variable 'ansible_module_compression' from source: unknown 7491 1727204000.54561: variable 'ansible_shell_type' from source: unknown 7491 1727204000.54569: variable 'ansible_shell_executable' from source: unknown 7491 1727204000.54575: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204000.54582: variable 'ansible_pipelining' from source: unknown 7491 1727204000.54587: variable 'ansible_timeout' from source: unknown 7491 1727204000.54594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204000.54818: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
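The first low-level command issued over the connection below, `/bin/sh -c 'echo ~ && sleep 0'`, is how Ansible resolves the remote user's home directory before creating its temp directory. A small local illustration of the same command (run against the local `/bin/sh` rather than over SSH):

```python
import subprocess

# Run the same shell snippet Ansible sends first; 'echo ~' expands to the
# current user's home directory, 'sleep 0' is a harmless terminator.
result = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True,
    text=True,
    check=True,
)
home = result.stdout.strip()
print(home)  # e.g. /root when connecting as root, as in this log
```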
7491 1727204000.54833: variable 'omit' from source: magic vars 7491 1727204000.54848: starting attempt loop 7491 1727204000.54856: running the handler 7491 1727204000.54876: _low_level_execute_command(): starting 7491 1727204000.54888: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727204000.55678: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204000.55693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.55710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204000.55735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.55782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204000.55794: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204000.55807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.55829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204000.55844: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204000.55855: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204000.55869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.55883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204000.55899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.55910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204000.55921: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204000.55940: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.56023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204000.56050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204000.56073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204000.56174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204000.57776: stdout chunk (state=3): >>>/root <<< 7491 1727204000.57973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204000.57976: stdout chunk (state=3): >>><<< 7491 1727204000.57979: stderr chunk (state=3): >>><<< 7491 1727204000.58097: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204000.58101: 
_low_level_execute_command(): starting 7491 1727204000.58104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204000.5800023-9415-7514953887346 `" && echo ansible-tmp-1727204000.5800023-9415-7514953887346="` echo /root/.ansible/tmp/ansible-tmp-1727204000.5800023-9415-7514953887346 `" ) && sleep 0' 7491 1727204000.58669: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204000.58678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.58689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204000.58702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.58743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204000.58752: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204000.58758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.58773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204000.58781: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204000.58788: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204000.58796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.58806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204000.58820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.58833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204000.58840: stderr 
chunk (state=3): >>>debug2: match found <<< 7491 1727204000.58852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.58927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204000.58941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204000.58951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204000.59019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204000.60844: stdout chunk (state=3): >>>ansible-tmp-1727204000.5800023-9415-7514953887346=/root/.ansible/tmp/ansible-tmp-1727204000.5800023-9415-7514953887346 <<< 7491 1727204000.61016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204000.61023: stderr chunk (state=3): >>><<< 7491 1727204000.61028: stdout chunk (state=3): >>><<< 7491 1727204000.61049: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204000.5800023-9415-7514953887346=/root/.ansible/tmp/ansible-tmp-1727204000.5800023-9415-7514953887346 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204000.61100: variable 'ansible_module_compression' from source: unknown 7491 1727204000.61151: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7491 1727204000.61197: variable 'ansible_facts' from source: unknown 7491 1727204000.61271: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204000.5800023-9415-7514953887346/AnsiballZ_service_facts.py 7491 1727204000.62602: Sending initial data 7491 1727204000.62605: Sent initial data (158 bytes) 7491 1727204000.64739: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.64745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.64791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.64795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.64873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204000.64880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.65007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204000.65013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204000.65053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204000.65159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204000.66801: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727204000.66835: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727204000.66878: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpzrqrhtyl /root/.ansible/tmp/ansible-tmp-1727204000.5800023-9415-7514953887346/AnsiballZ_service_facts.py <<< 7491 1727204000.66913: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727204000.68273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204000.68348: stderr chunk (state=3): >>><<< 7491 1727204000.68352: stdout chunk (state=3): >>><<< 7491 1727204000.68372: done transferring module to remote 7491 1727204000.68382: _low_level_execute_command(): starting 7491 1727204000.68388: _low_level_execute_command(): 
executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204000.5800023-9415-7514953887346/ /root/.ansible/tmp/ansible-tmp-1727204000.5800023-9415-7514953887346/AnsiballZ_service_facts.py && sleep 0' 7491 1727204000.69402: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204000.69410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.69424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204000.69435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.69480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204000.69487: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204000.69497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.69511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204000.69523: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204000.69526: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204000.69535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.69544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204000.69555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.69789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204000.69792: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204000.69794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 
1727204000.69797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204000.69799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204000.69801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204000.69803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204000.71518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204000.71522: stdout chunk (state=3): >>><<< 7491 1727204000.71532: stderr chunk (state=3): >>><<< 7491 1727204000.71548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204000.71551: _low_level_execute_command(): starting 7491 1727204000.71556: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204000.5800023-9415-7514953887346/AnsiballZ_service_facts.py && sleep 0' 7491 1727204000.72849: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204000.73484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.73494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204000.73507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.73548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204000.73554: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204000.73565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.73578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204000.73585: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204000.73591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204000.73599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204000.73607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204000.73619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204000.73629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204000.73636: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204000.73643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204000.73714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204000.73735: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204000.73747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204000.73829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204001.98145: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": 
{"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": 
{"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": 
{"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "stat<<< 7491 1727204001.98159: stdout chunk (state=3): >>>e": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": 
"cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", 
"state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap<<< 7491 1727204001.98183: stdout chunk (state=3): >>>.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7491 1727204001.99469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727204001.99474: stdout chunk (state=3): >>><<< 7491 1727204001.99476: stderr chunk (state=3): >>><<< 7491 1727204001.99682: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": 
{"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": 
"systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727204002.00266: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204000.5800023-9415-7514953887346/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727204002.00285: _low_level_execute_command(): starting 7491 1727204002.00296: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204000.5800023-9415-7514953887346/ > /dev/null 2>&1 && sleep 0' 7491 1727204002.02173: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204002.02307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204002.02324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204002.02335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204002.02377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 <<< 7491 1727204002.02411: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204002.02422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204002.02437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204002.02445: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204002.02490: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204002.02498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204002.02510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204002.02526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204002.02539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204002.02546: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204002.02555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204002.02629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204002.02768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204002.02778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204002.02873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204002.04716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204002.04724: stdout chunk (state=3): >>><<< 7491 1727204002.04726: stderr chunk (state=3): >>><<< 7491 1727204002.04747: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204002.04753: handler run complete 7491 1727204002.04929: variable 'ansible_facts' from source: unknown 7491 1727204002.05071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204002.05654: variable 'ansible_facts' from source: unknown 7491 1727204002.05770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204002.05940: attempt loop complete, returning result 7491 1727204002.05944: _execute() done 7491 1727204002.05946: dumping result to json 7491 1727204002.06000: done dumping result, returning 7491 1727204002.06010: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-0a4a-ad01-0000000019bb] 7491 1727204002.06017: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000019bb 7491 1727204002.06745: 
done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000019bb 7491 1727204002.06748: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727204002.06835: no more pending results, returning what we have 7491 1727204002.06838: results queue empty 7491 1727204002.06839: checking for any_errors_fatal 7491 1727204002.06845: done checking for any_errors_fatal 7491 1727204002.06846: checking for max_fail_percentage 7491 1727204002.06847: done checking for max_fail_percentage 7491 1727204002.06848: checking to see if all hosts have failed and the running result is not ok 7491 1727204002.06850: done checking to see if all hosts have failed 7491 1727204002.06850: getting the remaining hosts for this loop 7491 1727204002.06852: done getting the remaining hosts for this loop 7491 1727204002.06855: getting the next task for host managed-node3 7491 1727204002.06863: done getting next task for host managed-node3 7491 1727204002.06867: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7491 1727204002.06872: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 7491 1727204002.06885: getting variables 7491 1727204002.06887: in VariableManager get_vars() 7491 1727204002.06931: Calling all_inventory to load vars for managed-node3 7491 1727204002.06934: Calling groups_inventory to load vars for managed-node3 7491 1727204002.06937: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204002.06946: Calling all_plugins_play to load vars for managed-node3 7491 1727204002.06949: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204002.06957: Calling groups_plugins_play to load vars for managed-node3 7491 1727204002.08699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204002.11928: done with get_vars() 7491 1727204002.11953: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:53:22 -0400 (0:00:01.596) 0:00:44.044 ***** 7491 1727204002.12079: entering _queue_task() for managed-node3/package_facts 7491 1727204002.12596: worker is 1 (out of 1 available) 7491 1727204002.12609: exiting _queue_task() for managed-node3/package_facts 7491 1727204002.12625: done queuing things up, now waiting for results queue to drain 7491 1727204002.12626: waiting for pending results... 
7491 1727204002.12994: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7491 1727204002.13193: in run() - task 0affcd87-79f5-0a4a-ad01-0000000019bc 7491 1727204002.13243: variable 'ansible_search_path' from source: unknown 7491 1727204002.13252: variable 'ansible_search_path' from source: unknown 7491 1727204002.13305: calling self._execute() 7491 1727204002.13437: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204002.13468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204002.13485: variable 'omit' from source: magic vars 7491 1727204002.14008: variable 'ansible_distribution_major_version' from source: facts 7491 1727204002.14029: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204002.14041: variable 'omit' from source: magic vars 7491 1727204002.14137: variable 'omit' from source: magic vars 7491 1727204002.14183: variable 'omit' from source: magic vars 7491 1727204002.14265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204002.14322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204002.14355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204002.14394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204002.14426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204002.14495: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204002.14504: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204002.14512: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727204002.14660: Set connection var ansible_timeout to 10 7491 1727204002.14678: Set connection var ansible_pipelining to False 7491 1727204002.14690: Set connection var ansible_shell_type to sh 7491 1727204002.14699: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204002.14713: Set connection var ansible_shell_executable to /bin/sh 7491 1727204002.14724: Set connection var ansible_connection to ssh 7491 1727204002.14751: variable 'ansible_shell_executable' from source: unknown 7491 1727204002.14758: variable 'ansible_connection' from source: unknown 7491 1727204002.14766: variable 'ansible_module_compression' from source: unknown 7491 1727204002.14773: variable 'ansible_shell_type' from source: unknown 7491 1727204002.14784: variable 'ansible_shell_executable' from source: unknown 7491 1727204002.14794: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204002.14803: variable 'ansible_pipelining' from source: unknown 7491 1727204002.14811: variable 'ansible_timeout' from source: unknown 7491 1727204002.14822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204002.15054: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727204002.15072: variable 'omit' from source: magic vars 7491 1727204002.15080: starting attempt loop 7491 1727204002.15086: running the handler 7491 1727204002.15106: _low_level_execute_command(): starting 7491 1727204002.15126: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727204002.16233: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204002.16238: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204002.16255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204002.16398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204002.16438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204002.16529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204002.18069: stdout chunk (state=3): >>>/root <<< 7491 1727204002.18288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204002.18310: stderr chunk (state=3): >>><<< 7491 1727204002.18322: stdout chunk (state=3): >>><<< 7491 1727204002.18370: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204002.18375: _low_level_execute_command(): starting 7491 1727204002.18377: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204002.1833446-9464-131343419593223 `" && echo ansible-tmp-1727204002.1833446-9464-131343419593223="` echo /root/.ansible/tmp/ansible-tmp-1727204002.1833446-9464-131343419593223 `" ) && sleep 0' 7491 1727204002.19373: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204002.19383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204002.19399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204002.19414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204002.19455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204002.19462: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204002.19720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204002.19862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 
1727204002.20314: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204002.20321: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204002.20324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204002.20326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204002.20329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204002.20331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204002.20333: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204002.20335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204002.20337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204002.20339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204002.20341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204002.20343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204002.21581: stdout chunk (state=3): >>>ansible-tmp-1727204002.1833446-9464-131343419593223=/root/.ansible/tmp/ansible-tmp-1727204002.1833446-9464-131343419593223 <<< 7491 1727204002.21762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204002.21768: stdout chunk (state=3): >>><<< 7491 1727204002.21775: stderr chunk (state=3): >>><<< 7491 1727204002.21823: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204002.1833446-9464-131343419593223=/root/.ansible/tmp/ansible-tmp-1727204002.1833446-9464-131343419593223 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204002.21867: variable 'ansible_module_compression' from source: unknown 7491 1727204002.21935: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7491 1727204002.21993: variable 'ansible_facts' from source: unknown 7491 1727204002.22133: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204002.1833446-9464-131343419593223/AnsiballZ_package_facts.py 7491 1727204002.22257: Sending initial data 7491 1727204002.22260: Sent initial data (160 bytes) 7491 1727204002.23275: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204002.23279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204002.23325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204002.23329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727204002.23342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204002.23345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204002.23361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727204002.23368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204002.23443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204002.23447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204002.23460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204002.23530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204002.25184: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727204002.25221: stderr chunk (state=3): >>>debug1: Using server 
download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727204002.25257: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp8kkyr12v /root/.ansible/tmp/ansible-tmp-1727204002.1833446-9464-131343419593223/AnsiballZ_package_facts.py <<< 7491 1727204002.25295: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727204002.27524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204002.27529: stdout chunk (state=3): >>><<< 7491 1727204002.27531: stderr chunk (state=3): >>><<< 7491 1727204002.27533: done transferring module to remote 7491 1727204002.27535: _low_level_execute_command(): starting 7491 1727204002.27537: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204002.1833446-9464-131343419593223/ /root/.ansible/tmp/ansible-tmp-1727204002.1833446-9464-131343419593223/AnsiballZ_package_facts.py && sleep 0' 7491 1727204002.28227: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204002.28239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204002.28249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204002.28262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204002.28311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204002.28315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204002.28479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204002.28495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204002.30151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204002.30222: stderr chunk (state=3): >>><<< 7491 1727204002.30225: stdout chunk (state=3): >>><<< 7491 1727204002.30270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 
1727204002.30274: _low_level_execute_command(): starting 7491 1727204002.30276: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204002.1833446-9464-131343419593223/AnsiballZ_package_facts.py && sleep 0' 7491 1727204002.30698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204002.30702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204002.30740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204002.30758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204002.30761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204002.30821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204002.30825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204002.30827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204002.30884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204002.76479: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": 
"glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 7491 1727204002.76496: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 7491 1727204002.76501: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": 
"2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 7491 1727204002.76506: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", <<< 7491 1727204002.76510: stdout chunk (state=3): >>>"source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", 
"release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", <<< 7491 1727204002.76574: stdout chunk (state=3): >>>"release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": 
"3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": 
"1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", <<< 7491 1727204002.76588: stdout chunk (state=3): >>>"release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 7491 1727204002.76593: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 7491 1727204002.76598: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], 
"perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": 
"2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 7491 1727204002.76628: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", 
"release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": 
"125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 7491 1727204002.76634: stdout chunk (state=3): 
>>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 7491 1727204002.76647: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7491 1727204002.78128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
Shared connection to 10.31.15.87 closed. <<< 7491 1727204002.78191: stderr chunk (state=3): >>><<< 7491 1727204002.78194: stdout chunk (state=3): >>><<< 7491 1727204002.78280: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", 
"version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", 
"version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": 
"16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": 
"0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", 
"release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": 
[{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", 
"release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", 
"release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", 
"version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": 
[{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727204002.81100: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204002.1833446-9464-131343419593223/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727204002.81134: _low_level_execute_command(): starting 7491 1727204002.81146: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204002.1833446-9464-131343419593223/ > /dev/null 2>&1 && sleep 0' 7491 1727204002.81836: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204002.81871: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 7491 1727204002.81875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204002.81890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204002.81925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204002.81947: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7491 1727204002.81951: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727204002.81953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204002.82001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204002.82012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204002.82072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204002.84775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204002.84781: stderr chunk (state=3): >>><<< 7491 1727204002.84783: stdout chunk (state=3): >>><<< 7491 1727204002.84786: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.87 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7491 1727204002.84788: handler run complete
7491 1727204002.84961: variable 'ansible_facts' from source: unknown
7491 1727204002.85461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727204002.87545: variable 'ansible_facts' from source: unknown
7491 1727204002.87912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727204002.88374: attempt loop complete, returning result
7491 1727204002.88389: _execute() done
7491 1727204002.88397: dumping result to json
7491 1727204002.88536: done dumping result, returning
7491 1727204002.88547: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-0a4a-ad01-0000000019bc]
7491 1727204002.88554: sending task result for task 0affcd87-79f5-0a4a-ad01-0000000019bc
ok: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
7491 1727204002.90587: no more pending results, returning what we have
7491 1727204002.90591: results queue empty
7491 1727204002.90592: checking for any_errors_fatal
7491 1727204002.90600: done checking for any_errors_fatal
7491 1727204002.90601: checking for max_fail_percentage
7491 1727204002.90603: done checking for max_fail_percentage
7491 1727204002.90604: checking to see if all hosts have failed and the running result is not ok
7491 1727204002.90605: done checking to see if all hosts have failed
7491 1727204002.90606: getting the remaining hosts for this loop
7491 1727204002.90608: done getting the remaining hosts for this loop
7491 1727204002.90611: getting the next task for host managed-node3
7491 1727204002.90623: done getting next task for host managed-node3
7491 1727204002.90628: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
7491 1727204002.90631: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727204002.90645: getting variables
7491 1727204002.90647: in VariableManager get_vars()
7491 1727204002.90706: Calling all_inventory to load vars for managed-node3
7491 1727204002.90709: Calling groups_inventory to load vars for managed-node3
7491 1727204002.90711: Calling all_plugins_inventory to load vars for managed-node3
7491 1727204002.90753: Calling all_plugins_play to load vars for managed-node3
7491 1727204002.90756: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727204002.90760: Calling groups_plugins_play to load vars for managed-node3
7491 1727204002.91449: done sending task result for task 0affcd87-79f5-0a4a-ad01-0000000019bc
7491 1727204002.91453: WORKER PROCESS EXITING
7491 1727204002.91601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727204002.92602: done with get_vars()
7491 1727204002.92623: done getting variables
7491 1727204002.92669: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Tuesday 24 September 2024  14:53:22 -0400 (0:00:00.806)       0:00:44.850 *****
7491 1727204002.92700: entering _queue_task() for managed-node3/debug
7491 1727204002.92943: worker is 1 (out of 1 available)
7491 1727204002.92956: exiting _queue_task() for managed-node3/debug
7491 1727204002.92971: done queuing things up, now waiting for results queue to drain
7491 1727204002.92972: waiting for pending results...
7491 1727204002.93165: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider
7491 1727204002.93269: in run() - task 0affcd87-79f5-0a4a-ad01-00000000010c
7491 1727204002.93285: variable 'ansible_search_path' from source: unknown
7491 1727204002.93289: variable 'ansible_search_path' from source: unknown
7491 1727204002.93318: calling self._execute()
7491 1727204002.93396: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727204002.93401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727204002.93410: variable 'omit' from source: magic vars
7491 1727204002.93689: variable 'ansible_distribution_major_version' from source: facts
7491 1727204002.93699: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727204002.93706: variable 'omit' from source: magic vars
7491 1727204002.93748: variable 'omit' from source: magic vars
7491 1727204002.93816: variable 'network_provider' from source: set_fact
7491 1727204002.93834: variable 'omit' from source: magic vars
7491 1727204002.93868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7491 1727204002.93895: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7491 1727204002.93912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7491 1727204002.93926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727204002.93937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7491 1727204002.93961: variable 'inventory_hostname' from source: host vars for 'managed-node3'
7491 1727204002.93966: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727204002.93969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727204002.94036: Set connection var ansible_timeout to 10
7491 1727204002.94045: Set connection var ansible_pipelining to False
7491 1727204002.94048: Set connection var ansible_shell_type to sh
7491 1727204002.94053: Set connection var ansible_module_compression to ZIP_DEFLATED
7491 1727204002.94059: Set connection var ansible_shell_executable to /bin/sh
7491 1727204002.94065: Set connection var ansible_connection to ssh
7491 1727204002.94085: variable 'ansible_shell_executable' from source: unknown
7491 1727204002.94088: variable 'ansible_connection' from source: unknown
7491 1727204002.94090: variable 'ansible_module_compression' from source: unknown
7491 1727204002.94092: variable 'ansible_shell_type' from source: unknown
7491 1727204002.94094: variable 'ansible_shell_executable' from source: unknown
7491 1727204002.94096: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727204002.94101: variable 'ansible_pipelining' from source: unknown
7491 1727204002.94103: variable 'ansible_timeout' from source: unknown
7491 1727204002.94107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727204002.94210: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7491 1727204002.94220: variable 'omit' from source: magic vars
7491 1727204002.94224: starting attempt loop
7491 1727204002.94226: running the handler
7491 1727204002.94260: handler run complete
7491 1727204002.94273: attempt loop complete, returning result
7491 1727204002.94276: _execute() done
7491 1727204002.94278: dumping result to json
7491 1727204002.94281: done dumping result, returning
7491 1727204002.94288: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-0a4a-ad01-00000000010c]
7491 1727204002.94294: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000010c
7491 1727204002.94382: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000010c
7491 1727204002.94385: WORKER PROCESS EXITING
ok: [managed-node3] => {}

MSG:

Using network provider: nm
7491 1727204002.94456: no more pending results, returning what we have
7491 1727204002.94460: results queue empty
7491 1727204002.94461: checking for any_errors_fatal
7491 1727204002.94474: done checking for any_errors_fatal
7491 1727204002.94478: checking for max_fail_percentage
7491 1727204002.94480: done checking for max_fail_percentage
7491 1727204002.94480: checking to see if all hosts have failed and the running result is not ok
7491 1727204002.94481: done checking to see if all hosts have failed
7491 1727204002.94482: getting the remaining hosts for this loop
7491 1727204002.94484: done getting the remaining hosts for this loop
7491 1727204002.94487: getting the next task for host managed-node3
7491 1727204002.94493: done getting next task for host managed-node3
7491 1727204002.94496: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
7491 1727204002.94499: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727204002.94510: getting variables
7491 1727204002.94512: in VariableManager get_vars()
7491 1727204002.94557: Calling all_inventory to load vars for managed-node3
7491 1727204002.94559: Calling groups_inventory to load vars for managed-node3
7491 1727204002.94561: Calling all_plugins_inventory to load vars for managed-node3
7491 1727204002.94572: Calling all_plugins_play to load vars for managed-node3
7491 1727204002.94574: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727204002.94581: Calling groups_plugins_play to load vars for managed-node3
7491 1727204002.95388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727204002.96313: done with get_vars()
7491 1727204002.96334: done getting variables
7491 1727204002.96379: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Tuesday 24 September 2024  14:53:22 -0400 (0:00:00.037)       0:00:44.887 *****
7491 1727204002.96406: entering _queue_task() for managed-node3/fail
7491 1727204002.96639: worker is 1 (out of 1 available)
7491 1727204002.96653: exiting _queue_task() for managed-node3/fail
7491 1727204002.96667: done queuing things up, now waiting for results queue to drain
7491 1727204002.96668: waiting for pending results...
7491 1727204002.96857: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
7491 1727204002.96950: in run() - task 0affcd87-79f5-0a4a-ad01-00000000010d
7491 1727204002.96961: variable 'ansible_search_path' from source: unknown
7491 1727204002.96967: variable 'ansible_search_path' from source: unknown
7491 1727204002.96997: calling self._execute()
7491 1727204002.97075: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727204002.97080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727204002.97090: variable 'omit' from source: magic vars
7491 1727204002.97370: variable 'ansible_distribution_major_version' from source: facts
7491 1727204002.97380: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727204002.97468: variable 'network_state' from source: role '' defaults
7491 1727204002.97476: Evaluated conditional (network_state != {}): False
7491 1727204002.97479: when evaluation is False, skipping this task
7491 1727204002.97482: _execute() done
7491 1727204002.97486: dumping result to json
7491 1727204002.97488: done dumping result, returning
7491 1727204002.97495: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-0a4a-ad01-00000000010d]
7491 1727204002.97501: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000010d
7491 1727204002.97597: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000010d
7491 1727204002.97600: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7491 1727204002.97653: no more pending results, returning what we have
7491 1727204002.97657: results queue empty
7491 1727204002.97658: checking for any_errors_fatal
7491 1727204002.97669: done checking for any_errors_fatal
7491 1727204002.97669: checking for max_fail_percentage
7491 1727204002.97671: done checking for max_fail_percentage
7491 1727204002.97672: checking to see if all hosts have failed and the running result is not ok
7491 1727204002.97673: done checking to see if all hosts have failed
7491 1727204002.97674: getting the remaining hosts for this loop
7491 1727204002.97680: done getting the remaining hosts for this loop
7491 1727204002.97684: getting the next task for host managed-node3
7491 1727204002.97690: done getting next task for host managed-node3
7491 1727204002.97694: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
7491 1727204002.97697: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727204002.97716: getting variables
7491 1727204002.97718: in VariableManager get_vars()
7491 1727204002.97766: Calling all_inventory to load vars for managed-node3
7491 1727204002.97769: Calling groups_inventory to load vars for managed-node3
7491 1727204002.97771: Calling all_plugins_inventory to load vars for managed-node3
7491 1727204002.97780: Calling all_plugins_play to load vars for managed-node3
7491 1727204002.97782: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727204002.97787: Calling groups_plugins_play to load vars for managed-node3
7491 1727204002.98713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727204003.04946: done with get_vars()
7491 1727204003.04974: done getting variables
7491 1727204003.05012: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Tuesday 24 September 2024  14:53:23 -0400 (0:00:00.086)       0:00:44.974 *****
7491 1727204003.05033: entering _queue_task() for managed-node3/fail
7491 1727204003.05277: worker is 1 (out of 1 available)
7491 1727204003.05289: exiting _queue_task() for managed-node3/fail
7491 1727204003.05301: done queuing things up, now waiting for results queue to drain
7491 1727204003.05303: waiting for pending results...
7491 1727204003.05512: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
7491 1727204003.05704: in run() - task 0affcd87-79f5-0a4a-ad01-00000000010e
7491 1727204003.05710: variable 'ansible_search_path' from source: unknown
7491 1727204003.05715: variable 'ansible_search_path' from source: unknown
7491 1727204003.05736: calling self._execute()
7491 1727204003.05820: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727204003.05825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727204003.05835: variable 'omit' from source: magic vars
7491 1727204003.06120: variable 'ansible_distribution_major_version' from source: facts
7491 1727204003.06128: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727204003.06216: variable 'network_state' from source: role '' defaults
7491 1727204003.06226: Evaluated conditional (network_state != {}): False
7491 1727204003.06229: when evaluation is False, skipping this task
7491 1727204003.06232: _execute() done
7491 1727204003.06235: dumping result to json
7491 1727204003.06237: done dumping result, returning
7491 1727204003.06245: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-0a4a-ad01-00000000010e]
7491 1727204003.06250: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000010e
7491 1727204003.06351: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000010e
7491 1727204003.06353: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7491 1727204003.06405: no more pending results, returning what we have
7491 1727204003.06408: results queue empty
7491 1727204003.06409: checking for any_errors_fatal
7491 1727204003.06418: done checking for any_errors_fatal
7491 1727204003.06421: checking for max_fail_percentage
7491 1727204003.06423: done checking for max_fail_percentage
7491 1727204003.06424: checking to see if all hosts have failed and the running result is not ok
7491 1727204003.06425: done checking to see if all hosts have failed
7491 1727204003.06425: getting the remaining hosts for this loop
7491 1727204003.06428: done getting the remaining hosts for this loop
7491 1727204003.06431: getting the next task for host managed-node3
7491 1727204003.06436: done getting next task for host managed-node3
7491 1727204003.06440: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
7491 1727204003.06442: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727204003.06469: getting variables
7491 1727204003.06471: in VariableManager get_vars()
7491 1727204003.06521: Calling all_inventory to load vars for managed-node3
7491 1727204003.06524: Calling groups_inventory to load vars for managed-node3
7491 1727204003.06526: Calling all_plugins_inventory to load vars for managed-node3
7491 1727204003.06535: Calling all_plugins_play to load vars for managed-node3
7491 1727204003.06537: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727204003.06539: Calling groups_plugins_play to load vars for managed-node3
7491 1727204003.07620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727204003.09059: done with get_vars()
7491 1727204003.09082: done getting variables
7491 1727204003.09130: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Tuesday 24 September 2024  14:53:23 -0400 (0:00:00.041)       0:00:45.015 *****
7491 1727204003.09159: entering _queue_task() for managed-node3/fail
7491 1727204003.09401: worker is 1 (out of 1 available)
7491 1727204003.09415: exiting _queue_task() for managed-node3/fail
7491 1727204003.09429: done queuing things up, now waiting for results queue to drain
7491 1727204003.09430: waiting for pending results...
7491 1727204003.09633: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
7491 1727204003.09739: in run() - task 0affcd87-79f5-0a4a-ad01-00000000010f
7491 1727204003.09752: variable 'ansible_search_path' from source: unknown
7491 1727204003.09758: variable 'ansible_search_path' from source: unknown
7491 1727204003.09788: calling self._execute()
7491 1727204003.09873: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727204003.09877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727204003.09885: variable 'omit' from source: magic vars
7491 1727204003.10173: variable 'ansible_distribution_major_version' from source: facts
7491 1727204003.10184: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727204003.10368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7491 1727204003.13001: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7491 1727204003.13480: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7491 1727204003.13526: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7491 1727204003.13574: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7491 1727204003.13606: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7491 1727204003.13696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7491 1727204003.13731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7491 1727204003.13761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7491 1727204003.13813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7491 1727204003.13835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7491 1727204003.13943: variable 'ansible_distribution_major_version' from source: facts
7491 1727204003.13968: Evaluated conditional (ansible_distribution_major_version | int > 9): False
7491 1727204003.13976: when evaluation is False, skipping this task
7491 1727204003.13982: _execute() done
7491 1727204003.13988: dumping result to json
7491 1727204003.14002: done dumping result, returning
7491 1727204003.14013: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-0a4a-ad01-00000000010f]
7491 1727204003.14026: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000010f
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int > 9",
    "skip_reason": "Conditional result was False"
}
7491 1727204003.14178: no more pending results, returning what we have
7491 1727204003.14182: results queue empty
7491 1727204003.14183: checking for any_errors_fatal
7491 1727204003.14190: done checking for any_errors_fatal
7491 1727204003.14190: checking for max_fail_percentage
7491 1727204003.14192: done checking for max_fail_percentage
7491 1727204003.14193: checking to see if all hosts have failed and the running result is not ok
7491 1727204003.14194: done checking to see if all hosts have failed
7491 1727204003.14195: getting the remaining hosts for this loop
7491 1727204003.14197: done getting the remaining hosts for this loop
7491 1727204003.14211: getting the next task for host managed-node3
7491 1727204003.14218: done getting next task for host managed-node3
7491 1727204003.14224: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
7491 1727204003.14227: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727204003.14255: getting variables
7491 1727204003.14257: in VariableManager get_vars()
7491 1727204003.14311: Calling all_inventory to load vars for managed-node3
7491 1727204003.14314: Calling groups_inventory to load vars for managed-node3
7491 1727204003.14317: Calling all_plugins_inventory to load vars for managed-node3
7491 1727204003.14331: Calling all_plugins_play to load vars for managed-node3
7491 1727204003.14334: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727204003.14336: Calling groups_plugins_play to load vars for managed-node3
7491 1727204003.15307: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000010f
7491 1727204003.15311: WORKER PROCESS EXITING
7491 1727204003.16475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727204003.18643: done with get_vars()
7491 1727204003.18675: done getting variables
7491 1727204003.18745: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024  14:53:23 -0400 (0:00:00.096)       0:00:45.111 *****
7491 1727204003.18781: entering _queue_task() for managed-node3/dnf
7491 1727204003.19134: worker is 1 (out of 1 available)
7491 1727204003.19151: exiting _queue_task() for managed-node3/dnf
7491 1727204003.19166: done queuing things up, now waiting for results queue to drain
7491 1727204003.19168: waiting for pending results...
7491 1727204003.19563: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
7491 1727204003.19856: in run() - task 0affcd87-79f5-0a4a-ad01-000000000110
7491 1727204003.19892: variable 'ansible_search_path' from source: unknown
7491 1727204003.19906: variable 'ansible_search_path' from source: unknown
7491 1727204003.19960: calling self._execute()
7491 1727204003.20099: variable 'ansible_host' from source: host vars for 'managed-node3'
7491 1727204003.20113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
7491 1727204003.20136: variable 'omit' from source: magic vars
7491 1727204003.20608: variable 'ansible_distribution_major_version' from source: facts
7491 1727204003.20632: Evaluated conditional (ansible_distribution_major_version != '6'): True
7491 1727204003.20887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7491 1727204003.24053: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7491 1727204003.24135: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7491 1727204003.24181: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7491 1727204003.24222: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7491 1727204003.24257: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7491 1727204003.24338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7491 1727204003.24381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7491 1727204003.24416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7491 1727204003.24470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7491 1727204003.24491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7491 1727204003.24625: variable 'ansible_distribution' from source: facts
7491 1727204003.24635: variable 'ansible_distribution_major_version' from source: facts
7491 1727204003.24654: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
7491 1727204003.24776: variable '__network_wireless_connections_defined' from source: role '' defaults
7491 1727204003.24925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7491 1727204003.24951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7491 1727204003.24982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7491 1727204003.25032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7491 1727204003.25050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7491 1727204003.25097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7491 1727204003.25131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7491 1727204003.25158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7491 1727204003.25202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7491 1727204003.25224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7491 1727204003.25272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7491 1727204003.25301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7491 1727204003.25331: Loading
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.25377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204003.25394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204003.25562: variable 'network_connections' from source: task vars 7491 1727204003.25581: variable 'interface' from source: play vars 7491 1727204003.25650: variable 'interface' from source: play vars 7491 1727204003.25734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727204003.25916: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727204003.25960: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727204003.26003: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727204003.26039: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727204003.26089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727204003.26121: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727204003.26159: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.26192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727204003.26246: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727204003.26507: variable 'network_connections' from source: task vars 7491 1727204003.26517: variable 'interface' from source: play vars 7491 1727204003.26591: variable 'interface' from source: play vars 7491 1727204003.26622: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727204003.26633: when evaluation is False, skipping this task 7491 1727204003.26643: _execute() done 7491 1727204003.26650: dumping result to json 7491 1727204003.26656: done dumping result, returning 7491 1727204003.26668: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-000000000110] 7491 1727204003.26678: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000110 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7491 1727204003.26844: no more pending results, returning what we have 7491 1727204003.26848: results queue empty 7491 1727204003.26849: checking for any_errors_fatal 7491 1727204003.26858: done checking for any_errors_fatal 7491 1727204003.26858: checking for max_fail_percentage 7491 1727204003.26860: done checking for max_fail_percentage 7491 1727204003.26862: checking to see if all hosts have failed 
and the running result is not ok 7491 1727204003.26863: done checking to see if all hosts have failed 7491 1727204003.26865: getting the remaining hosts for this loop 7491 1727204003.26867: done getting the remaining hosts for this loop 7491 1727204003.26872: getting the next task for host managed-node3 7491 1727204003.26879: done getting next task for host managed-node3 7491 1727204003.26884: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7491 1727204003.26887: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204003.26910: getting variables 7491 1727204003.26912: in VariableManager get_vars() 7491 1727204003.26969: Calling all_inventory to load vars for managed-node3 7491 1727204003.26972: Calling groups_inventory to load vars for managed-node3 7491 1727204003.26975: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204003.26986: Calling all_plugins_play to load vars for managed-node3 7491 1727204003.26989: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204003.26992: Calling groups_plugins_play to load vars for managed-node3 7491 1727204003.27983: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000110 7491 1727204003.27987: WORKER PROCESS EXITING 7491 1727204003.28816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204003.30528: done with get_vars() 7491 1727204003.30559: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7491 1727204003.31550: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:53:23 -0400 (0:00:00.128) 0:00:45.239 ***** 7491 1727204003.31588: entering _queue_task() for managed-node3/yum 7491 1727204003.31938: worker is 1 (out of 1 available) 7491 1727204003.31950: exiting _queue_task() for managed-node3/yum 7491 1727204003.31965: done queuing things up, now waiting for results 
queue to drain 7491 1727204003.31966: waiting for pending results... 7491 1727204003.34044: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7491 1727204003.34289: in run() - task 0affcd87-79f5-0a4a-ad01-000000000111 7491 1727204003.34303: variable 'ansible_search_path' from source: unknown 7491 1727204003.34307: variable 'ansible_search_path' from source: unknown 7491 1727204003.34457: calling self._execute() 7491 1727204003.34672: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204003.34678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204003.34689: variable 'omit' from source: magic vars 7491 1727204003.35421: variable 'ansible_distribution_major_version' from source: facts 7491 1727204003.35544: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204003.35837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727204003.39468: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727204003.39548: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727204003.39606: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727204003.39651: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727204003.39691: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727204003.39779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7491 1727204003.39811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204003.39843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.39894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204003.39912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204003.40022: variable 'ansible_distribution_major_version' from source: facts 7491 1727204003.40043: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7491 1727204003.40050: when evaluation is False, skipping this task 7491 1727204003.40056: _execute() done 7491 1727204003.40062: dumping result to json 7491 1727204003.40072: done dumping result, returning 7491 1727204003.40086: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-000000000111] 7491 1727204003.40096: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000111 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7491 1727204003.40255: no more pending results, returning what we have 7491 1727204003.40260: results queue empty 7491 1727204003.40261: checking for 
any_errors_fatal 7491 1727204003.40271: done checking for any_errors_fatal 7491 1727204003.40272: checking for max_fail_percentage 7491 1727204003.40274: done checking for max_fail_percentage 7491 1727204003.40275: checking to see if all hosts have failed and the running result is not ok 7491 1727204003.40276: done checking to see if all hosts have failed 7491 1727204003.40277: getting the remaining hosts for this loop 7491 1727204003.40279: done getting the remaining hosts for this loop 7491 1727204003.40283: getting the next task for host managed-node3 7491 1727204003.40290: done getting next task for host managed-node3 7491 1727204003.40295: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7491 1727204003.40298: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204003.40324: getting variables 7491 1727204003.40326: in VariableManager get_vars() 7491 1727204003.40382: Calling all_inventory to load vars for managed-node3 7491 1727204003.40385: Calling groups_inventory to load vars for managed-node3 7491 1727204003.40387: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204003.40399: Calling all_plugins_play to load vars for managed-node3 7491 1727204003.40402: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204003.40404: Calling groups_plugins_play to load vars for managed-node3 7491 1727204003.41775: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000111 7491 1727204003.41779: WORKER PROCESS EXITING 7491 1727204003.42535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204003.44515: done with get_vars() 7491 1727204003.44547: done getting variables 7491 1727204003.44609: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:53:23 -0400 (0:00:00.130) 0:00:45.370 ***** 7491 1727204003.44648: entering _queue_task() for managed-node3/fail 7491 1727204003.45752: worker is 1 (out of 1 available) 7491 1727204003.45768: exiting _queue_task() for managed-node3/fail 7491 1727204003.45781: done queuing things up, now waiting for results queue to drain 7491 1727204003.45782: waiting for pending results... 
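The two "skipping:" results above both follow the same pattern: the task executor evaluates each `when:` conditional in order and, on the first falsy one, emits a result carrying `false_condition` and `skip_reason` instead of running the module. The following is a minimal Python sketch of that decision flow, not Ansible's actual internals (the real evaluation goes through Jinja2 templating in `TaskExecutor._execute()`); the conditional strings and the result shape are copied from the log.

```python
# Minimal sketch (NOT Ansible internals) of the skip logic visible in the log:
# a task runs only when every `when:` conditional evaluates truthy; otherwise
# the executor returns a "skipping:" result naming the first false condition.
def evaluate_task(conditionals, variables):
    """conditionals: list of (expression_text, predicate) pairs.

    Returns a dict mirroring the log's skipping output.
    """
    for expr, test in conditionals:
        if not test(variables):
            return {
                "changed": False,
                "false_condition": expr,
                "skip_reason": "Conditional result was False",
            }
    return {"changed": True}  # placeholder for "module actually ran"


# Facts as in this run: a distribution with major version >= 8, so the
# YUM-specific check task is skipped. "9" here is an illustrative value.
facts = {"ansible_distribution_major_version": "9"}

result = evaluate_task(
    [
        ("ansible_distribution_major_version != '6'",
         lambda v: v["ansible_distribution_major_version"] != "6"),
        ("ansible_distribution_major_version | int < 8",
         lambda v: int(v["ansible_distribution_major_version"]) < 8),
    ],
    facts,
)
print(result["false_condition"])  # -> ansible_distribution_major_version | int < 8
```

This also explains why the log shows `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` immediately before the skip: the action plugin is resolved and loaded before the conditionals are evaluated, so the redirect happens even for a task that never runs.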
7491 1727204003.46711: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7491 1727204003.47079: in run() - task 0affcd87-79f5-0a4a-ad01-000000000112 7491 1727204003.47094: variable 'ansible_search_path' from source: unknown 7491 1727204003.47098: variable 'ansible_search_path' from source: unknown 7491 1727204003.47138: calling self._execute() 7491 1727204003.47354: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204003.47358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204003.47483: variable 'omit' from source: magic vars 7491 1727204003.48040: variable 'ansible_distribution_major_version' from source: facts 7491 1727204003.48051: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204003.48182: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727204003.48403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727204003.50949: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727204003.51043: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727204003.51090: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727204003.51127: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727204003.51155: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727204003.51248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7491 1727204003.51281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204003.51311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.51353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204003.51369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204003.51427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204003.51450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204003.51478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.51530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204003.51544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7491 1727204003.51586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204003.51622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204003.51644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.51688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204003.51701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204003.51900: variable 'network_connections' from source: task vars 7491 1727204003.51914: variable 'interface' from source: play vars 7491 1727204003.52003: variable 'interface' from source: play vars 7491 1727204003.52086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727204003.52272: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727204003.52325: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727204003.52357: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727204003.52395: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7491 1727204003.52439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727204003.52461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727204003.52499: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.52524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727204003.52579: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727204003.52858: variable 'network_connections' from source: task vars 7491 1727204003.52868: variable 'interface' from source: play vars 7491 1727204003.52940: variable 'interface' from source: play vars 7491 1727204003.52968: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727204003.52971: when evaluation is False, skipping this task 7491 1727204003.52974: _execute() done 7491 1727204003.52976: dumping result to json 7491 1727204003.52979: done dumping result, returning 7491 1727204003.52988: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-000000000112] 7491 1727204003.52993: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000112 7491 1727204003.53103: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000112 7491 1727204003.53106: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7491 1727204003.53173: no more pending results, returning what we have 7491 1727204003.53177: results queue empty 7491 1727204003.53178: checking for any_errors_fatal 7491 1727204003.53184: done checking for any_errors_fatal 7491 1727204003.53184: checking for max_fail_percentage 7491 1727204003.53186: done checking for max_fail_percentage 7491 1727204003.53187: checking to see if all hosts have failed and the running result is not ok 7491 1727204003.53188: done checking to see if all hosts have failed 7491 1727204003.53189: getting the remaining hosts for this loop 7491 1727204003.53191: done getting the remaining hosts for this loop 7491 1727204003.53195: getting the next task for host managed-node3 7491 1727204003.53202: done getting next task for host managed-node3 7491 1727204003.53206: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7491 1727204003.53209: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204003.53235: getting variables 7491 1727204003.53237: in VariableManager get_vars() 7491 1727204003.53286: Calling all_inventory to load vars for managed-node3 7491 1727204003.53289: Calling groups_inventory to load vars for managed-node3 7491 1727204003.53291: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204003.53301: Calling all_plugins_play to load vars for managed-node3 7491 1727204003.53303: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204003.53306: Calling groups_plugins_play to load vars for managed-node3 7491 1727204003.55391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204003.59768: done with get_vars() 7491 1727204003.59792: done getting variables 7491 1727204003.59851: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:53:23 -0400 (0:00:00.152) 0:00:45.522 ***** 7491 1727204003.59887: entering _queue_task() for managed-node3/package 7491 1727204003.60841: worker is 1 (out of 1 available) 7491 1727204003.60853: exiting _queue_task() for managed-node3/package 7491 1727204003.60868: done queuing things up, now waiting for results queue to drain 7491 1727204003.60869: waiting for pending results... 
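Both the DNF update check and the NetworkManager restart-consent task above were skipped on the same conditional, `__network_wireless_connections_defined or __network_team_connections_defined`. Those role-default variables are derived from `network_connections` (the log shows it being templated together with the `interface` play var each time). Below is a hedged illustration of the underlying check; the actual role defaults compute these flags with Jinja2 filters over `network_connections`, and the connection name used here is hypothetical since the run's real `interface` value is not shown in this chunk.

```python
# Illustrative only: approximates what __network_wireless_connections_defined /
# __network_team_connections_defined jointly express -- "does any configured
# connection use a wireless or team interface?" The real role computes these
# as Jinja2 expressions over network_connections, not via this helper.
def restart_consent_needed(network_connections):
    return any(c.get("type") in ("wireless", "team") for c in network_connections)


# With only an ethernet connection (as in this run), the conditional is False,
# so both the package-update check and the consent prompt are skipped.
# "testnic" is a made-up connection name for the example.
print(restart_consent_needed([{"name": "testnic", "type": "ethernet"}]))  # False
```

The practical consequence, visible in the log, is that the role only pauses for user consent (or checks for pending NetworkManager package updates) when applying a configuration that would require restarting NetworkManager; plain ethernet profiles fall through to the `Install packages` task directly.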
7491 1727204003.61182: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 7491 1727204003.61345: in run() - task 0affcd87-79f5-0a4a-ad01-000000000113 7491 1727204003.61363: variable 'ansible_search_path' from source: unknown 7491 1727204003.61372: variable 'ansible_search_path' from source: unknown 7491 1727204003.61417: calling self._execute() 7491 1727204003.61537: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204003.61548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204003.61562: variable 'omit' from source: magic vars 7491 1727204003.61966: variable 'ansible_distribution_major_version' from source: facts 7491 1727204003.61987: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204003.62206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727204003.62505: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727204003.62562: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727204003.62606: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727204003.62699: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727204003.62834: variable 'network_packages' from source: role '' defaults 7491 1727204003.62957: variable '__network_provider_setup' from source: role '' defaults 7491 1727204003.62974: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727204003.63050: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727204003.63070: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727204003.63144: variable '__network_packages_default_nm' from source: role 
'' defaults 7491 1727204003.63347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727204003.65512: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727204003.65588: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727204003.65636: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727204003.65678: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727204003.65707: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727204003.65809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204003.65847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204003.65884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.65937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204003.65957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204003.66010: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204003.66040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204003.66069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.66113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204003.66134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204003.66357: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7491 1727204003.66474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204003.66500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204003.66531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.66577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204003.66594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204003.66688: variable 'ansible_python' from source: facts 7491 1727204003.66722: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7491 1727204003.66815: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727204003.66909: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727204003.67052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204003.67084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204003.67116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.67163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204003.67186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204003.67243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204003.67277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204003.67300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.67344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204003.67361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204003.67514: variable 'network_connections' from source: task vars 7491 1727204003.67533: variable 'interface' from source: play vars 7491 1727204003.67640: variable 'interface' from source: play vars 7491 1727204003.67716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727204003.67758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727204003.67794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204003.67832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727204003.67889: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727204003.68623: variable 'network_connections' from source: task vars 7491 1727204003.68634: variable 'interface' from source: play vars 7491 1727204003.68853: variable 'interface' from source: play vars 7491 1727204003.68921: variable '__network_packages_default_wireless' from source: role '' defaults 7491 1727204003.69181: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727204003.69879: variable 'network_connections' from source: task vars 7491 1727204003.69937: variable 'interface' from source: play vars 7491 1727204003.70015: variable 'interface' from source: play vars 7491 1727204003.70173: variable '__network_packages_default_team' from source: role '' defaults 7491 1727204003.70374: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727204003.71059: variable 'network_connections' from source: task vars 7491 1727204003.71072: variable 'interface' from source: play vars 7491 1727204003.71259: variable 'interface' from source: play vars 7491 1727204003.71325: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727204003.71396: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727204003.71571: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727204003.71638: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727204003.72104: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7491 1727204003.73078: variable 'network_connections' from source: task vars 7491 1727204003.73206: variable 'interface' from source: play vars 7491 1727204003.73281: variable 'interface' from source: play vars 7491 
1727204003.73427: variable 'ansible_distribution' from source: facts 7491 1727204003.73437: variable '__network_rh_distros' from source: role '' defaults 7491 1727204003.73448: variable 'ansible_distribution_major_version' from source: facts 7491 1727204003.73472: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7491 1727204003.73827: variable 'ansible_distribution' from source: facts 7491 1727204003.73961: variable '__network_rh_distros' from source: role '' defaults 7491 1727204003.73973: variable 'ansible_distribution_major_version' from source: facts 7491 1727204003.73993: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7491 1727204003.74289: variable 'ansible_distribution' from source: facts 7491 1727204003.74396: variable '__network_rh_distros' from source: role '' defaults 7491 1727204003.74409: variable 'ansible_distribution_major_version' from source: facts 7491 1727204003.74456: variable 'network_provider' from source: set_fact 7491 1727204003.74481: variable 'ansible_facts' from source: unknown 7491 1727204003.76196: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7491 1727204003.76204: when evaluation is False, skipping this task 7491 1727204003.76210: _execute() done 7491 1727204003.76215: dumping result to json 7491 1727204003.76224: done dumping result, returning 7491 1727204003.76238: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-0a4a-ad01-000000000113] 7491 1727204003.76247: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000113 skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 7491 1727204003.76412: no more pending results, returning what we have 7491 1727204003.76416: results queue empty 7491 
1727204003.76417: checking for any_errors_fatal 7491 1727204003.76430: done checking for any_errors_fatal 7491 1727204003.76431: checking for max_fail_percentage 7491 1727204003.76433: done checking for max_fail_percentage 7491 1727204003.76434: checking to see if all hosts have failed and the running result is not ok 7491 1727204003.76436: done checking to see if all hosts have failed 7491 1727204003.76436: getting the remaining hosts for this loop 7491 1727204003.76438: done getting the remaining hosts for this loop 7491 1727204003.76443: getting the next task for host managed-node3 7491 1727204003.76452: done getting next task for host managed-node3 7491 1727204003.76456: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7491 1727204003.76459: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204003.76484: getting variables 7491 1727204003.76486: in VariableManager get_vars() 7491 1727204003.76543: Calling all_inventory to load vars for managed-node3 7491 1727204003.76546: Calling groups_inventory to load vars for managed-node3 7491 1727204003.76548: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204003.76560: Calling all_plugins_play to load vars for managed-node3 7491 1727204003.76565: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204003.76569: Calling groups_plugins_play to load vars for managed-node3 7491 1727204003.78073: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000113 7491 1727204003.78077: WORKER PROCESS EXITING 7491 1727204003.79217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204003.83425: done with get_vars() 7491 1727204003.83461: done getting variables 7491 1727204003.83528: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:53:23 -0400 (0:00:00.236) 0:00:45.759 ***** 7491 1727204003.83567: entering _queue_task() for managed-node3/package 7491 1727204003.84008: worker is 1 (out of 1 available) 7491 1727204003.84023: exiting _queue_task() for managed-node3/package 7491 1727204003.84037: done queuing things up, now waiting for results queue to drain 7491 1727204003.84038: waiting for pending results... 
7491 1727204003.84840: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7491 1727204003.85215: in run() - task 0affcd87-79f5-0a4a-ad01-000000000114 7491 1727204003.85230: variable 'ansible_search_path' from source: unknown 7491 1727204003.85235: variable 'ansible_search_path' from source: unknown 7491 1727204003.85385: calling self._execute() 7491 1727204003.85615: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204003.85622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204003.85631: variable 'omit' from source: magic vars 7491 1727204003.86485: variable 'ansible_distribution_major_version' from source: facts 7491 1727204003.86497: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204003.86741: variable 'network_state' from source: role '' defaults 7491 1727204003.86751: Evaluated conditional (network_state != {}): False 7491 1727204003.86755: when evaluation is False, skipping this task 7491 1727204003.86758: _execute() done 7491 1727204003.86761: dumping result to json 7491 1727204003.86764: done dumping result, returning 7491 1727204003.86773: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-0a4a-ad01-000000000114] 7491 1727204003.86779: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000114 7491 1727204003.87004: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000114 7491 1727204003.87007: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727204003.87065: no more pending results, returning what we have 7491 1727204003.87070: results queue empty 7491 1727204003.87072: checking for any_errors_fatal 
7491 1727204003.87084: done checking for any_errors_fatal 7491 1727204003.87085: checking for max_fail_percentage 7491 1727204003.87087: done checking for max_fail_percentage 7491 1727204003.87088: checking to see if all hosts have failed and the running result is not ok 7491 1727204003.87090: done checking to see if all hosts have failed 7491 1727204003.87090: getting the remaining hosts for this loop 7491 1727204003.87092: done getting the remaining hosts for this loop 7491 1727204003.87096: getting the next task for host managed-node3 7491 1727204003.87103: done getting next task for host managed-node3 7491 1727204003.87108: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7491 1727204003.87111: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204003.87138: getting variables 7491 1727204003.87140: in VariableManager get_vars() 7491 1727204003.87192: Calling all_inventory to load vars for managed-node3 7491 1727204003.87194: Calling groups_inventory to load vars for managed-node3 7491 1727204003.87197: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204003.87210: Calling all_plugins_play to load vars for managed-node3 7491 1727204003.87212: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204003.87215: Calling groups_plugins_play to load vars for managed-node3 7491 1727204003.90311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204003.92059: done with get_vars() 7491 1727204003.92131: done getting variables 7491 1727204003.92194: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:53:23 -0400 (0:00:00.086) 0:00:45.846 ***** 7491 1727204003.92233: entering _queue_task() for managed-node3/package 7491 1727204003.92587: worker is 1 (out of 1 available) 7491 1727204003.92599: exiting _queue_task() for managed-node3/package 7491 1727204003.92614: done queuing things up, now waiting for results queue to drain 7491 1727204003.92616: waiting for pending results... 
7491 1727204003.93585: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7491 1727204003.93933: in run() - task 0affcd87-79f5-0a4a-ad01-000000000115 7491 1727204003.93962: variable 'ansible_search_path' from source: unknown 7491 1727204003.93975: variable 'ansible_search_path' from source: unknown 7491 1727204003.94025: calling self._execute() 7491 1727204003.94167: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204003.94180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204003.94197: variable 'omit' from source: magic vars 7491 1727204003.94602: variable 'ansible_distribution_major_version' from source: facts 7491 1727204003.94614: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204003.94710: variable 'network_state' from source: role '' defaults 7491 1727204003.94724: Evaluated conditional (network_state != {}): False 7491 1727204003.94728: when evaluation is False, skipping this task 7491 1727204003.94731: _execute() done 7491 1727204003.94734: dumping result to json 7491 1727204003.94736: done dumping result, returning 7491 1727204003.94742: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-0a4a-ad01-000000000115] 7491 1727204003.94749: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000115 7491 1727204003.94850: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000115 7491 1727204003.94853: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727204003.94902: no more pending results, returning what we have 7491 1727204003.94906: results queue empty 7491 1727204003.94907: checking for any_errors_fatal 7491 
1727204003.94914: done checking for any_errors_fatal 7491 1727204003.94915: checking for max_fail_percentage 7491 1727204003.94916: done checking for max_fail_percentage 7491 1727204003.94917: checking to see if all hosts have failed and the running result is not ok 7491 1727204003.94921: done checking to see if all hosts have failed 7491 1727204003.94921: getting the remaining hosts for this loop 7491 1727204003.94923: done getting the remaining hosts for this loop 7491 1727204003.94927: getting the next task for host managed-node3 7491 1727204003.94934: done getting next task for host managed-node3 7491 1727204003.94938: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7491 1727204003.94941: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204003.94966: getting variables 7491 1727204003.94968: in VariableManager get_vars() 7491 1727204003.95021: Calling all_inventory to load vars for managed-node3 7491 1727204003.95025: Calling groups_inventory to load vars for managed-node3 7491 1727204003.95027: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204003.95038: Calling all_plugins_play to load vars for managed-node3 7491 1727204003.95040: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204003.95042: Calling groups_plugins_play to load vars for managed-node3 7491 1727204003.96025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204003.98091: done with get_vars() 7491 1727204003.98130: done getting variables 7491 1727204003.98202: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:53:23 -0400 (0:00:00.060) 0:00:45.906 ***** 7491 1727204003.98247: entering _queue_task() for managed-node3/service 7491 1727204003.98598: worker is 1 (out of 1 available) 7491 1727204003.98611: exiting _queue_task() for managed-node3/service 7491 1727204003.98628: done queuing things up, now waiting for results queue to drain 7491 1727204003.98629: waiting for pending results... 
7491 1727204003.98955: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7491 1727204003.99112: in run() - task 0affcd87-79f5-0a4a-ad01-000000000116 7491 1727204003.99130: variable 'ansible_search_path' from source: unknown 7491 1727204003.99148: variable 'ansible_search_path' from source: unknown 7491 1727204003.99183: calling self._execute() 7491 1727204003.99281: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204003.99287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204003.99295: variable 'omit' from source: magic vars 7491 1727204003.99595: variable 'ansible_distribution_major_version' from source: facts 7491 1727204003.99606: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204003.99901: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727204004.00091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727204004.02841: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727204004.02910: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727204004.02991: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727204004.02999: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727204004.03028: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727204004.03277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 
1727204004.03281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.03284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.03286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204004.03288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204004.03290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204004.03443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.03447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.03450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204004.03477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7491 1727204004.03500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204004.03526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.03553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.03591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204004.03615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204004.04100: variable 'network_connections' from source: task vars 7491 1727204004.04115: variable 'interface' from source: play vars 7491 1727204004.04199: variable 'interface' from source: play vars 7491 1727204004.04269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727204004.04649: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727204004.04895: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727204004.04930: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727204004.04978: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727204004.05019: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727204004.05051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727204004.05103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.05109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727204004.05166: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727204004.05477: variable 'network_connections' from source: task vars 7491 1727204004.05488: variable 'interface' from source: play vars 7491 1727204004.05571: variable 'interface' from source: play vars 7491 1727204004.05605: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7491 1727204004.05613: when evaluation is False, skipping this task 7491 1727204004.05621: _execute() done 7491 1727204004.05629: dumping result to json 7491 1727204004.05637: done dumping result, returning 7491 1727204004.05650: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0a4a-ad01-000000000116] 7491 1727204004.05671: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000116 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7491 1727204004.05902: 
no more pending results, returning what we have 7491 1727204004.05906: results queue empty 7491 1727204004.05907: checking for any_errors_fatal 7491 1727204004.05915: done checking for any_errors_fatal 7491 1727204004.05915: checking for max_fail_percentage 7491 1727204004.05917: done checking for max_fail_percentage 7491 1727204004.05919: checking to see if all hosts have failed and the running result is not ok 7491 1727204004.05920: done checking to see if all hosts have failed 7491 1727204004.05920: getting the remaining hosts for this loop 7491 1727204004.05922: done getting the remaining hosts for this loop 7491 1727204004.05927: getting the next task for host managed-node3 7491 1727204004.05934: done getting next task for host managed-node3 7491 1727204004.05939: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7491 1727204004.05941: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204004.05966: getting variables 7491 1727204004.05969: in VariableManager get_vars() 7491 1727204004.06027: Calling all_inventory to load vars for managed-node3 7491 1727204004.06031: Calling groups_inventory to load vars for managed-node3 7491 1727204004.06033: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204004.06045: Calling all_plugins_play to load vars for managed-node3 7491 1727204004.06048: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204004.06051: Calling groups_plugins_play to load vars for managed-node3 7491 1727204004.08377: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000116 7491 1727204004.08381: WORKER PROCESS EXITING 7491 1727204004.10062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204004.10993: done with get_vars() 7491 1727204004.11016: done getting variables 7491 1727204004.11065: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:53:24 -0400 (0:00:00.128) 0:00:46.034 ***** 7491 1727204004.11090: entering _queue_task() for managed-node3/service 7491 1727204004.11343: worker is 1 (out of 1 available) 7491 1727204004.11356: exiting _queue_task() for managed-node3/service 7491 1727204004.11371: done queuing things up, now waiting for results queue to drain 7491 1727204004.11373: waiting for pending results... 
7491 1727204004.11572: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7491 1727204004.11740: in run() - task 0affcd87-79f5-0a4a-ad01-000000000117 7491 1727204004.11773: variable 'ansible_search_path' from source: unknown 7491 1727204004.11783: variable 'ansible_search_path' from source: unknown 7491 1727204004.11832: calling self._execute() 7491 1727204004.11961: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204004.11978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204004.11999: variable 'omit' from source: magic vars 7491 1727204004.12442: variable 'ansible_distribution_major_version' from source: facts 7491 1727204004.12468: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204004.12676: variable 'network_provider' from source: set_fact 7491 1727204004.12689: variable 'network_state' from source: role '' defaults 7491 1727204004.12705: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7491 1727204004.12721: variable 'omit' from source: magic vars 7491 1727204004.12808: variable 'omit' from source: magic vars 7491 1727204004.12851: variable 'network_service_name' from source: role '' defaults 7491 1727204004.12932: variable 'network_service_name' from source: role '' defaults 7491 1727204004.13013: variable '__network_provider_setup' from source: role '' defaults 7491 1727204004.13018: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727204004.13073: variable '__network_service_name_default_nm' from source: role '' defaults 7491 1727204004.13081: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727204004.13132: variable '__network_packages_default_nm' from source: role '' defaults 7491 1727204004.13293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 
1727204004.14912: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727204004.14973: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727204004.15001: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727204004.15031: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727204004.15051: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727204004.15113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204004.15138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.15155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.15186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204004.15198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204004.15232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7491 1727204004.15250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.15267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.15296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204004.15306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204004.15471: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7491 1727204004.15552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204004.15572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.15589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.15618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204004.15630: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204004.15696: variable 'ansible_python' from source: facts 7491 1727204004.15716: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7491 1727204004.15777: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727204004.15837: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727204004.15921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204004.15943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.15959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.15985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204004.15995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204004.16035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204004.16054: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.16073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.16097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204004.16109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204004.16205: variable 'network_connections' from source: task vars 7491 1727204004.16210: variable 'interface' from source: play vars 7491 1727204004.16270: variable 'interface' from source: play vars 7491 1727204004.16347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727204004.16491: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727204004.16527: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727204004.16560: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727204004.16594: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727204004.16642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727204004.16666: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727204004.16690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.16715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727204004.16753: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727204004.16942: variable 'network_connections' from source: task vars 7491 1727204004.16948: variable 'interface' from source: play vars 7491 1727204004.17005: variable 'interface' from source: play vars 7491 1727204004.17037: variable '__network_packages_default_wireless' from source: role '' defaults 7491 1727204004.17098: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727204004.17293: variable 'network_connections' from source: task vars 7491 1727204004.17297: variable 'interface' from source: play vars 7491 1727204004.17351: variable 'interface' from source: play vars 7491 1727204004.17371: variable '__network_packages_default_team' from source: role '' defaults 7491 1727204004.17427: variable '__network_team_connections_defined' from source: role '' defaults 7491 1727204004.17628: variable 'network_connections' from source: task vars 7491 1727204004.17631: variable 'interface' from source: play vars 7491 1727204004.17685: variable 'interface' from source: play vars 7491 1727204004.17725: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727204004.17770: variable '__network_service_name_default_initscripts' from source: role '' defaults 7491 1727204004.17776: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7491 1727204004.17820: variable '__network_packages_default_initscripts' from source: role '' defaults 7491 1727204004.17965: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7491 1727204004.18299: variable 'network_connections' from source: task vars 7491 1727204004.18302: variable 'interface' from source: play vars 7491 1727204004.18352: variable 'interface' from source: play vars 7491 1727204004.18355: variable 'ansible_distribution' from source: facts 7491 1727204004.18358: variable '__network_rh_distros' from source: role '' defaults 7491 1727204004.18367: variable 'ansible_distribution_major_version' from source: facts 7491 1727204004.18377: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7491 1727204004.18497: variable 'ansible_distribution' from source: facts 7491 1727204004.18504: variable '__network_rh_distros' from source: role '' defaults 7491 1727204004.18507: variable 'ansible_distribution_major_version' from source: facts 7491 1727204004.18516: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7491 1727204004.18634: variable 'ansible_distribution' from source: facts 7491 1727204004.18637: variable '__network_rh_distros' from source: role '' defaults 7491 1727204004.18640: variable 'ansible_distribution_major_version' from source: facts 7491 1727204004.18669: variable 'network_provider' from source: set_fact 7491 1727204004.18688: variable 'omit' from source: magic vars 7491 1727204004.18711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204004.18737: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204004.18753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204004.18767: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204004.18779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204004.18801: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204004.18804: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204004.18806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204004.18878: Set connection var ansible_timeout to 10 7491 1727204004.18884: Set connection var ansible_pipelining to False 7491 1727204004.18890: Set connection var ansible_shell_type to sh 7491 1727204004.18892: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204004.18901: Set connection var ansible_shell_executable to /bin/sh 7491 1727204004.18905: Set connection var ansible_connection to ssh 7491 1727204004.18927: variable 'ansible_shell_executable' from source: unknown 7491 1727204004.18930: variable 'ansible_connection' from source: unknown 7491 1727204004.18932: variable 'ansible_module_compression' from source: unknown 7491 1727204004.18939: variable 'ansible_shell_type' from source: unknown 7491 1727204004.18942: variable 'ansible_shell_executable' from source: unknown 7491 1727204004.18944: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204004.18946: variable 'ansible_pipelining' from source: unknown 7491 1727204004.18948: variable 'ansible_timeout' from source: unknown 7491 1727204004.18950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204004.19027: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204004.19036: variable 'omit' from source: magic vars 7491 1727204004.19048: starting attempt loop 7491 1727204004.19051: running the handler 7491 1727204004.19104: variable 'ansible_facts' from source: unknown 7491 1727204004.19610: _low_level_execute_command(): starting 7491 1727204004.19614: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727204004.20136: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204004.20152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.20169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204004.20182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.20193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.20243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204004.20254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204004.20311: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204004.21934: stdout chunk (state=3): >>>/root <<< 7491 1727204004.22036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204004.22094: stderr chunk (state=3): >>><<< 7491 1727204004.22097: stdout chunk (state=3): >>><<< 7491 1727204004.22115: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204004.22128: _low_level_execute_command(): starting 7491 1727204004.22137: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204004.2211585-9542-94220967020964 `" && echo ansible-tmp-1727204004.2211585-9542-94220967020964="` echo /root/.ansible/tmp/ansible-tmp-1727204004.2211585-9542-94220967020964 `" ) && sleep 0' 
7491 1727204004.22593: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204004.22604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.22635: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.22648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.22699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204004.22711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204004.22770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204004.24565: stdout chunk (state=3): >>>ansible-tmp-1727204004.2211585-9542-94220967020964=/root/.ansible/tmp/ansible-tmp-1727204004.2211585-9542-94220967020964 <<< 7491 1727204004.24679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204004.24734: stderr chunk (state=3): >>><<< 7491 1727204004.24740: stdout chunk (state=3): >>><<< 7491 1727204004.24755: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204004.2211585-9542-94220967020964=/root/.ansible/tmp/ansible-tmp-1727204004.2211585-9542-94220967020964 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204004.24782: variable 'ansible_module_compression' from source: unknown 7491 1727204004.24827: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7491 1727204004.24875: variable 'ansible_facts' from source: unknown 7491 1727204004.25005: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204004.2211585-9542-94220967020964/AnsiballZ_systemd.py 7491 1727204004.25122: Sending initial data 7491 1727204004.25125: Sent initial data (153 bytes) 7491 1727204004.25813: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204004.25825: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.25852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.25872: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.25925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204004.25936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204004.25987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204004.27639: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727204004.27677: stderr chunk (state=3): >>>debug1: Using server download size 
261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727204004.27723: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpa236ynkm /root/.ansible/tmp/ansible-tmp-1727204004.2211585-9542-94220967020964/AnsiballZ_systemd.py <<< 7491 1727204004.27753: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727204004.29507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204004.29628: stderr chunk (state=3): >>><<< 7491 1727204004.29631: stdout chunk (state=3): >>><<< 7491 1727204004.29647: done transferring module to remote 7491 1727204004.29656: _low_level_execute_command(): starting 7491 1727204004.29661: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204004.2211585-9542-94220967020964/ /root/.ansible/tmp/ansible-tmp-1727204004.2211585-9542-94220967020964/AnsiballZ_systemd.py && sleep 0' 7491 1727204004.30135: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204004.30141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.30171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.30183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.30236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204004.30248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204004.30302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204004.31967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204004.32018: stderr chunk (state=3): >>><<< 7491 1727204004.32024: stdout chunk (state=3): >>><<< 7491 1727204004.32036: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204004.32040: _low_level_execute_command(): starting 7491 1727204004.32045: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204004.2211585-9542-94220967020964/AnsiballZ_systemd.py && sleep 0' 7491 1727204004.32511: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204004.32517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.32551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.32563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.32617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204004.32633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204004.32685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204004.57387: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", 
"TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.sl<<< 7491 1727204004.57419: stdout chunk (state=3): >>>ice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "15179776", "MemoryAvailable": "infinity", "CPUUsageNSec": "231683000", "TasksCurrent": 
"3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": 
"infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "<<< 7491 1727204004.57431: stdout chunk (state=3): >>>SendSIGKILL": "yes", "SendSIGHUP": "no", 
"WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "basic.target dbus.socket system.slice dbus-broker.service network-pre.target systemd-journald.socket cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:47:46 EDT", "StateChangeTimestampMonotonic": "12973041", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", 
"AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7491 1727204004.58903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727204004.58973: stderr chunk (state=3): >>><<< 7491 1727204004.58976: stdout chunk (state=3): >>><<< 7491 1727204004.58995: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 14:47:46 EDT] ; stop_time=[n/a] ; pid=616 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "15179776", "MemoryAvailable": "infinity", "CPUUsageNSec": "231683000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": 
"infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": 
"no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "basic.target dbus.socket system.slice dbus-broker.service network-pre.target systemd-journald.socket cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", 
"UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:47:46 EDT", "StateChangeTimestampMonotonic": "12973041", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727204004.59113: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204004.2211585-9542-94220967020964/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727204004.59129: _low_level_execute_command(): starting 7491 1727204004.59134: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204004.2211585-9542-94220967020964/ > /dev/null 2>&1 && sleep 0' 7491 1727204004.59615: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 
1727204004.59623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204004.59630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.59668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.59681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.59736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204004.59747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204004.59799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204004.61531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204004.61585: stderr chunk (state=3): >>><<< 7491 1727204004.61588: stdout chunk (state=3): >>><<< 7491 1727204004.61603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 
10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204004.61609: handler run complete 7491 1727204004.61654: attempt loop complete, returning result 7491 1727204004.61657: _execute() done 7491 1727204004.61660: dumping result to json 7491 1727204004.61673: done dumping result, returning 7491 1727204004.61681: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-0a4a-ad01-000000000117] 7491 1727204004.61686: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000117 7491 1727204004.62040: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000117 7491 1727204004.62043: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727204004.62104: no more pending results, returning what we have 7491 1727204004.62108: results queue empty 7491 1727204004.62109: checking for any_errors_fatal 7491 1727204004.62116: done checking for any_errors_fatal 7491 1727204004.62117: checking for max_fail_percentage 7491 1727204004.62121: done checking for max_fail_percentage 
7491 1727204004.62122: checking to see if all hosts have failed and the running result is not ok 7491 1727204004.62123: done checking to see if all hosts have failed 7491 1727204004.62124: getting the remaining hosts for this loop 7491 1727204004.62126: done getting the remaining hosts for this loop 7491 1727204004.62129: getting the next task for host managed-node3 7491 1727204004.62134: done getting next task for host managed-node3 7491 1727204004.62137: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7491 1727204004.62140: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204004.62158: getting variables 7491 1727204004.62160: in VariableManager get_vars() 7491 1727204004.62203: Calling all_inventory to load vars for managed-node3 7491 1727204004.62206: Calling groups_inventory to load vars for managed-node3 7491 1727204004.62207: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204004.62216: Calling all_plugins_play to load vars for managed-node3 7491 1727204004.62221: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204004.62224: Calling groups_plugins_play to load vars for managed-node3 7491 1727204004.63058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204004.64098: done with get_vars() 7491 1727204004.64117: done getting variables 7491 1727204004.64163: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:53:24 -0400 (0:00:00.530) 0:00:46.565 ***** 7491 1727204004.64190: entering _queue_task() for managed-node3/service 7491 1727204004.64427: worker is 1 (out of 1 available) 7491 1727204004.64441: exiting _queue_task() for managed-node3/service 7491 1727204004.64454: done queuing things up, now waiting for results queue to drain 7491 1727204004.64455: waiting for pending results... 
7491 1727204004.64651: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7491 1727204004.64746: in run() - task 0affcd87-79f5-0a4a-ad01-000000000118 7491 1727204004.64758: variable 'ansible_search_path' from source: unknown 7491 1727204004.64763: variable 'ansible_search_path' from source: unknown 7491 1727204004.64795: calling self._execute() 7491 1727204004.64879: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204004.64882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204004.64891: variable 'omit' from source: magic vars 7491 1727204004.65193: variable 'ansible_distribution_major_version' from source: facts 7491 1727204004.65203: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204004.65289: variable 'network_provider' from source: set_fact 7491 1727204004.65294: Evaluated conditional (network_provider == "nm"): True 7491 1727204004.65361: variable '__network_wpa_supplicant_required' from source: role '' defaults 7491 1727204004.65430: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7491 1727204004.65555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727204004.67126: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727204004.67176: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727204004.67205: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727204004.67233: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727204004.67253: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 
1727204004.67322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204004.67344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.67362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.67391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204004.67407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204004.67442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204004.67458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.67478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.67511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 7491 1727204004.67524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204004.67551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204004.67569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.67586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.67616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204004.67631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204004.67739: variable 'network_connections' from source: task vars 7491 1727204004.67749: variable 'interface' from source: play vars 7491 1727204004.67802: variable 'interface' from source: play vars 7491 1727204004.67858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727204004.67971: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727204004.67998: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727204004.68019: Loading TestModule 
'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727204004.68042: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727204004.68083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7491 1727204004.68099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7491 1727204004.68115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.68134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7491 1727204004.68179: variable '__network_wireless_connections_defined' from source: role '' defaults 7491 1727204004.68340: variable 'network_connections' from source: task vars 7491 1727204004.68343: variable 'interface' from source: play vars 7491 1727204004.68393: variable 'interface' from source: play vars 7491 1727204004.68416: Evaluated conditional (__network_wpa_supplicant_required): False 7491 1727204004.68422: when evaluation is False, skipping this task 7491 1727204004.68424: _execute() done 7491 1727204004.68427: dumping result to json 7491 1727204004.68429: done dumping result, returning 7491 1727204004.68434: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-0a4a-ad01-000000000118] 7491 1727204004.68445: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000118 7491 1727204004.68545: done sending task result 
for task 0affcd87-79f5-0a4a-ad01-000000000118 7491 1727204004.68548: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7491 1727204004.68601: no more pending results, returning what we have 7491 1727204004.68605: results queue empty 7491 1727204004.68606: checking for any_errors_fatal 7491 1727204004.68631: done checking for any_errors_fatal 7491 1727204004.68631: checking for max_fail_percentage 7491 1727204004.68633: done checking for max_fail_percentage 7491 1727204004.68634: checking to see if all hosts have failed and the running result is not ok 7491 1727204004.68635: done checking to see if all hosts have failed 7491 1727204004.68636: getting the remaining hosts for this loop 7491 1727204004.68638: done getting the remaining hosts for this loop 7491 1727204004.68641: getting the next task for host managed-node3 7491 1727204004.68647: done getting next task for host managed-node3 7491 1727204004.68651: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7491 1727204004.68653: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204004.68676: getting variables 7491 1727204004.68677: in VariableManager get_vars() 7491 1727204004.68732: Calling all_inventory to load vars for managed-node3 7491 1727204004.68735: Calling groups_inventory to load vars for managed-node3 7491 1727204004.68738: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204004.68746: Calling all_plugins_play to load vars for managed-node3 7491 1727204004.68748: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204004.68751: Calling groups_plugins_play to load vars for managed-node3 7491 1727204004.69584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204004.70532: done with get_vars() 7491 1727204004.70552: done getting variables 7491 1727204004.70598: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:53:24 -0400 (0:00:00.064) 0:00:46.630 ***** 7491 1727204004.70624: entering _queue_task() for managed-node3/service 7491 1727204004.70866: worker is 1 (out of 1 available) 7491 1727204004.70880: exiting _queue_task() for managed-node3/service 7491 1727204004.70894: done queuing things up, now waiting for results queue to drain 7491 1727204004.70895: waiting for pending results... 
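The entries above trace Ansible's strategy loop: `entering _queue_task()` hands the task to a worker process, then the strategy waits for the results queue to drain (`waiting for pending results...`). A minimal illustrative sketch of that producer/worker/results pattern (this is not Ansible source code, just a model of the lifecycle the log shows):

```python
# Illustrative queue/worker lifecycle, mirroring the log's
# "_queue_task() -> worker -> results queue" flow. Not Ansible internals.
import queue
import threading

task_q: "queue.Queue[str]" = queue.Queue()
result_q: "queue.Queue[dict]" = queue.Queue()

def worker() -> None:
    # One worker, as in "worker is 1 (out of 1 available)".
    task = task_q.get()
    # A failed `when:` conditional yields a skip result, not an error.
    result_q.put({"task": task, "changed": False, "skipped": True})
    task_q.task_done()

threading.Thread(target=worker, daemon=True).start()
task_q.put("Enable network service")   # entering _queue_task()
task_q.join()                          # done queuing things up
result = result_q.get()                # waiting for pending results...
print(result["skipped"])
```

The real strategy plugin multiplexes many hosts and tasks over a pool of forked workers; the single-thread version here only shows the hand-off and drain steps named in the log.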
7491 1727204004.71084: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 7491 1727204004.71176: in run() - task 0affcd87-79f5-0a4a-ad01-000000000119 7491 1727204004.71189: variable 'ansible_search_path' from source: unknown 7491 1727204004.71193: variable 'ansible_search_path' from source: unknown 7491 1727204004.71224: calling self._execute() 7491 1727204004.71321: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204004.71328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204004.71338: variable 'omit' from source: magic vars 7491 1727204004.71635: variable 'ansible_distribution_major_version' from source: facts 7491 1727204004.71645: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204004.71733: variable 'network_provider' from source: set_fact 7491 1727204004.71738: Evaluated conditional (network_provider == "initscripts"): False 7491 1727204004.71740: when evaluation is False, skipping this task 7491 1727204004.71743: _execute() done 7491 1727204004.71751: dumping result to json 7491 1727204004.71756: done dumping result, returning 7491 1727204004.71762: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-0a4a-ad01-000000000119] 7491 1727204004.71771: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000119 7491 1727204004.71857: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000119 7491 1727204004.71860: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7491 1727204004.71918: no more pending results, returning what we have 7491 1727204004.71922: results queue empty 7491 1727204004.71923: checking for any_errors_fatal 7491 1727204004.71931: done checking for any_errors_fatal 7491 
1727204004.71932: checking for max_fail_percentage 7491 1727204004.71933: done checking for max_fail_percentage 7491 1727204004.71934: checking to see if all hosts have failed and the running result is not ok 7491 1727204004.71935: done checking to see if all hosts have failed 7491 1727204004.71936: getting the remaining hosts for this loop 7491 1727204004.71938: done getting the remaining hosts for this loop 7491 1727204004.71941: getting the next task for host managed-node3 7491 1727204004.71947: done getting next task for host managed-node3 7491 1727204004.71950: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7491 1727204004.71953: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204004.71979: getting variables 7491 1727204004.71981: in VariableManager get_vars() 7491 1727204004.72024: Calling all_inventory to load vars for managed-node3 7491 1727204004.72027: Calling groups_inventory to load vars for managed-node3 7491 1727204004.72029: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204004.72038: Calling all_plugins_play to load vars for managed-node3 7491 1727204004.72040: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204004.72043: Calling groups_plugins_play to load vars for managed-node3 7491 1727204004.72994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204004.73905: done with get_vars() 7491 1727204004.73926: done getting variables 7491 1727204004.73970: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:53:24 -0400 (0:00:00.033) 0:00:46.663 ***** 7491 1727204004.73995: entering _queue_task() for managed-node3/copy 7491 1727204004.74233: worker is 1 (out of 1 available) 7491 1727204004.74247: exiting _queue_task() for managed-node3/copy 7491 1727204004.74261: done queuing things up, now waiting for results queue to drain 7491 1727204004.74263: waiting for pending results... 
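The two skipped results above come from task-level conditionals (`Evaluated conditional (...): False` followed by `skipping`); the "Enable network service" result is additionally shown as censored because the task sets `no_log: true`, which hides output even for a skipped task. A minimal, hypothetical task illustrating both keywords (field values are illustrative, not copied from the `fedora.linux_system_roles.network` role):

```yaml
# Hypothetical task showing the two keywords visible in the log output;
# not taken from the actual role tasks file.
- name: Enable network service
  service:
    name: network
    enabled: true
  when: network_provider == "initscripts"   # False here => "skipping: [host]"
  no_log: true                              # result rendered as "censored"
```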
7491 1727204004.74458: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7491 1727204004.74553: in run() - task 0affcd87-79f5-0a4a-ad01-00000000011a 7491 1727204004.74567: variable 'ansible_search_path' from source: unknown 7491 1727204004.74570: variable 'ansible_search_path' from source: unknown 7491 1727204004.74601: calling self._execute() 7491 1727204004.74686: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204004.74691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204004.74699: variable 'omit' from source: magic vars 7491 1727204004.74990: variable 'ansible_distribution_major_version' from source: facts 7491 1727204004.75000: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204004.75086: variable 'network_provider' from source: set_fact 7491 1727204004.75091: Evaluated conditional (network_provider == "initscripts"): False 7491 1727204004.75094: when evaluation is False, skipping this task 7491 1727204004.75097: _execute() done 7491 1727204004.75099: dumping result to json 7491 1727204004.75104: done dumping result, returning 7491 1727204004.75110: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-0a4a-ad01-00000000011a] 7491 1727204004.75116: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011a 7491 1727204004.75208: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011a 7491 1727204004.75211: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7491 1727204004.75266: no more pending results, returning what we have 7491 1727204004.75270: results queue empty 7491 1727204004.75271: checking for any_errors_fatal 7491 
1727204004.75278: done checking for any_errors_fatal 7491 1727204004.75279: checking for max_fail_percentage 7491 1727204004.75281: done checking for max_fail_percentage 7491 1727204004.75282: checking to see if all hosts have failed and the running result is not ok 7491 1727204004.75283: done checking to see if all hosts have failed 7491 1727204004.75283: getting the remaining hosts for this loop 7491 1727204004.75285: done getting the remaining hosts for this loop 7491 1727204004.75289: getting the next task for host managed-node3 7491 1727204004.75294: done getting next task for host managed-node3 7491 1727204004.75297: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7491 1727204004.75300: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204004.75321: getting variables 7491 1727204004.75323: in VariableManager get_vars() 7491 1727204004.75374: Calling all_inventory to load vars for managed-node3 7491 1727204004.75377: Calling groups_inventory to load vars for managed-node3 7491 1727204004.75379: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204004.75388: Calling all_plugins_play to load vars for managed-node3 7491 1727204004.75390: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204004.75393: Calling groups_plugins_play to load vars for managed-node3 7491 1727204004.76197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204004.77138: done with get_vars() 7491 1727204004.77158: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:53:24 -0400 (0:00:00.032) 0:00:46.696 ***** 7491 1727204004.77228: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 7491 1727204004.77472: worker is 1 (out of 1 available) 7491 1727204004.77485: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 7491 1727204004.77499: done queuing things up, now waiting for results queue to drain 7491 1727204004.77500: waiting for pending results... 
7491 1727204004.77692: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7491 1727204004.77792: in run() - task 0affcd87-79f5-0a4a-ad01-00000000011b 7491 1727204004.77805: variable 'ansible_search_path' from source: unknown 7491 1727204004.77808: variable 'ansible_search_path' from source: unknown 7491 1727204004.77839: calling self._execute() 7491 1727204004.77925: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204004.77929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204004.77940: variable 'omit' from source: magic vars 7491 1727204004.78223: variable 'ansible_distribution_major_version' from source: facts 7491 1727204004.78239: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204004.78245: variable 'omit' from source: magic vars 7491 1727204004.78291: variable 'omit' from source: magic vars 7491 1727204004.78413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727204004.80253: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727204004.80300: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727204004.80331: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727204004.80358: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727204004.80380: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727204004.80439: variable 'network_provider' from source: set_fact 7491 1727204004.80542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204004.80568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204004.80584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204004.80611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204004.80624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204004.80678: variable 'omit' from source: magic vars 7491 1727204004.80757: variable 'omit' from source: magic vars 7491 1727204004.80833: variable 'network_connections' from source: task vars 7491 1727204004.80844: variable 'interface' from source: play vars 7491 1727204004.80890: variable 'interface' from source: play vars 7491 1727204004.81002: variable 'omit' from source: magic vars 7491 1727204004.81010: variable '__lsr_ansible_managed' from source: task vars 7491 1727204004.81054: variable '__lsr_ansible_managed' from source: task vars 7491 1727204004.81253: Loaded config def from plugin (lookup/template) 7491 1727204004.81257: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7491 1727204004.81282: File lookup term: get_ansible_managed.j2 7491 1727204004.81286: variable 'ansible_search_path' from source: unknown 7491 1727204004.81290: evaluation_path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7491 1727204004.81301: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7491 1727204004.81314: variable 'ansible_search_path' from source: unknown 7491 1727204004.84726: variable 'ansible_managed' from source: unknown 7491 1727204004.84818: variable 'omit' from source: magic vars 7491 1727204004.84840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204004.84861: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204004.84879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204004.84901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204004.84910: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204004.84935: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204004.84938: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204004.84940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204004.85011: Set connection var ansible_timeout to 10 7491 1727204004.85016: Set connection var ansible_pipelining to False 7491 1727204004.85024: Set connection var ansible_shell_type to sh 7491 1727204004.85029: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204004.85035: Set connection var ansible_shell_executable to /bin/sh 7491 1727204004.85040: Set connection var ansible_connection to ssh 7491 1727204004.85058: variable 'ansible_shell_executable' from source: unknown 7491 1727204004.85060: variable 'ansible_connection' from source: unknown 7491 1727204004.85063: variable 'ansible_module_compression' from source: unknown 7491 1727204004.85067: variable 'ansible_shell_type' from source: unknown 7491 1727204004.85069: variable 'ansible_shell_executable' from source: unknown 7491 1727204004.85072: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204004.85077: variable 'ansible_pipelining' from source: unknown 7491 1727204004.85079: variable 'ansible_timeout' from source: unknown 7491 1727204004.85083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204004.85183: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727204004.85194: variable 'omit' from source: magic vars 7491 1727204004.85198: starting attempt loop 7491 1727204004.85201: running the handler 7491 
1727204004.85214: _low_level_execute_command(): starting 7491 1727204004.85223: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727204004.85740: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204004.85759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.85783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727204004.85794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.85843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204004.85853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204004.85913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204004.87517: stdout chunk (state=3): >>>/root <<< 7491 1727204004.87615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204004.87676: stderr chunk (state=3): >>><<< 7491 1727204004.87679: stdout chunk (state=3): >>><<< 7491 1727204004.87701: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204004.87716: _low_level_execute_command(): starting 7491 1727204004.87725: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204004.877061-9554-45485159322053 `" && echo ansible-tmp-1727204004.877061-9554-45485159322053="` echo /root/.ansible/tmp/ansible-tmp-1727204004.877061-9554-45485159322053 `" ) && sleep 0' 7491 1727204004.88202: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204004.88215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.88241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match not found <<< 7491 1727204004.88259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.88301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204004.88313: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204004.88370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204004.90158: stdout chunk (state=3): >>>ansible-tmp-1727204004.877061-9554-45485159322053=/root/.ansible/tmp/ansible-tmp-1727204004.877061-9554-45485159322053 <<< 7491 1727204004.90268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204004.90328: stderr chunk (state=3): >>><<< 7491 1727204004.90331: stdout chunk (state=3): >>><<< 7491 1727204004.90347: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204004.877061-9554-45485159322053=/root/.ansible/tmp/ansible-tmp-1727204004.877061-9554-45485159322053 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204004.90393: variable 'ansible_module_compression' from source: unknown 7491 1727204004.90434: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7491 1727204004.90476: variable 'ansible_facts' from source: unknown 7491 1727204004.90567: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204004.877061-9554-45485159322053/AnsiballZ_network_connections.py 7491 1727204004.90682: Sending initial data 7491 1727204004.90691: Sent initial data (164 bytes) 7491 1727204004.91387: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204004.91390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.91429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.91436: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204004.91438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.91488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204004.91492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204004.91539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204004.93196: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 7491 1727204004.93202: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727204004.93236: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727204004.93274: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpl3du03qw /root/.ansible/tmp/ansible-tmp-1727204004.877061-9554-45485159322053/AnsiballZ_network_connections.py <<< 7491 1727204004.93313: stderr chunk (state=3): >>>debug1: 
Couldn't stat remote file: No such file or directory <<< 7491 1727204004.94430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204004.94540: stderr chunk (state=3): >>><<< 7491 1727204004.94543: stdout chunk (state=3): >>><<< 7491 1727204004.94562: done transferring module to remote 7491 1727204004.94573: _low_level_execute_command(): starting 7491 1727204004.94577: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204004.877061-9554-45485159322053/ /root/.ansible/tmp/ansible-tmp-1727204004.877061-9554-45485159322053/AnsiballZ_network_connections.py && sleep 0' 7491 1727204004.95049: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204004.95053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.95090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.95093: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204004.95095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.95152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204004.95155: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 7491 1727204004.95157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204004.95203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204004.96855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204004.96906: stderr chunk (state=3): >>><<< 7491 1727204004.96909: stdout chunk (state=3): >>><<< 7491 1727204004.96926: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204004.96929: _low_level_execute_command(): starting 7491 1727204004.96934: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204004.877061-9554-45485159322053/AnsiballZ_network_connections.py && sleep 0' 7491 1727204004.97390: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204004.97402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204004.97421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204004.97435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204004.97448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204004.97492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204004.97511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204004.97556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204005.25929: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uh5ld70v/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 7491 1727204005.25934: stdout chunk (state=3): >>> File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uh5ld70v/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/0fe9d42b-408b-41ad-8245-f1fe5397f441: error=unknown <<< 7491 1727204005.26094: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7491 1727204005.27667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727204005.27725: stderr chunk (state=3): >>><<< 7491 1727204005.27729: stdout chunk (state=3): >>><<< 7491 1727204005.27746: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uh5ld70v/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uh5ld70v/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/0fe9d42b-408b-41ad-8245-f1fe5397f441: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727204005.27777: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204004.877061-9554-45485159322053/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727204005.27788: _low_level_execute_command(): starting 7491 1727204005.27794: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204004.877061-9554-45485159322053/ > /dev/null 2>&1 && sleep 0' 7491 1727204005.28272: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204005.28276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204005.28313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204005.28330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204005.28381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204005.28394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204005.28448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204005.30207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204005.30269: stderr chunk (state=3): >>><<< 7491 1727204005.30274: stdout chunk (state=3): >>><<< 7491 1727204005.30287: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204005.30293: handler run complete 7491 1727204005.30315: attempt loop complete, returning result 7491 1727204005.30318: _execute() done 7491 1727204005.30320: dumping result to json 7491 1727204005.30330: done dumping result, returning 7491 1727204005.30338: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-0a4a-ad01-00000000011b] 7491 1727204005.30345: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011b 7491 1727204005.30449: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011b 7491 1727204005.30452: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 7491 1727204005.30550: no more pending results, returning what we have 7491 1727204005.30554: results queue empty 7491 
1727204005.30555: checking for any_errors_fatal 7491 1727204005.30561: done checking for any_errors_fatal 7491 1727204005.30562: checking for max_fail_percentage 7491 1727204005.30565: done checking for max_fail_percentage 7491 1727204005.30566: checking to see if all hosts have failed and the running result is not ok 7491 1727204005.30568: done checking to see if all hosts have failed 7491 1727204005.30568: getting the remaining hosts for this loop 7491 1727204005.30570: done getting the remaining hosts for this loop 7491 1727204005.30573: getting the next task for host managed-node3 7491 1727204005.30579: done getting next task for host managed-node3 7491 1727204005.30583: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7491 1727204005.30585: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204005.30598: getting variables 7491 1727204005.30599: in VariableManager get_vars() 7491 1727204005.30646: Calling all_inventory to load vars for managed-node3 7491 1727204005.30650: Calling groups_inventory to load vars for managed-node3 7491 1727204005.30652: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204005.30661: Calling all_plugins_play to load vars for managed-node3 7491 1727204005.30663: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204005.30672: Calling groups_plugins_play to load vars for managed-node3 7491 1727204005.31670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204005.32597: done with get_vars() 7491 1727204005.32618: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:53:25 -0400 (0:00:00.554) 0:00:47.250 ***** 7491 1727204005.32686: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 7491 1727204005.32933: worker is 1 (out of 1 available) 7491 1727204005.32948: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 7491 1727204005.32962: done queuing things up, now waiting for results queue to drain 7491 1727204005.32965: waiting for pending results... 
7491 1727204005.33170: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 7491 1727204005.33274: in run() - task 0affcd87-79f5-0a4a-ad01-00000000011c 7491 1727204005.33286: variable 'ansible_search_path' from source: unknown 7491 1727204005.33293: variable 'ansible_search_path' from source: unknown 7491 1727204005.33327: calling self._execute() 7491 1727204005.33414: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.33421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204005.33428: variable 'omit' from source: magic vars 7491 1727204005.33713: variable 'ansible_distribution_major_version' from source: facts 7491 1727204005.33724: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204005.33819: variable 'network_state' from source: role '' defaults 7491 1727204005.33833: Evaluated conditional (network_state != {}): False 7491 1727204005.33838: when evaluation is False, skipping this task 7491 1727204005.33841: _execute() done 7491 1727204005.33843: dumping result to json 7491 1727204005.33846: done dumping result, returning 7491 1727204005.33851: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-0a4a-ad01-00000000011c] 7491 1727204005.33860: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011c 7491 1727204005.33949: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011c 7491 1727204005.33952: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7491 1727204005.34004: no more pending results, returning what we have 7491 1727204005.34008: results queue empty 7491 1727204005.34009: checking for any_errors_fatal 7491 1727204005.34022: done checking for any_errors_fatal 7491 1727204005.34022: 
checking for max_fail_percentage 7491 1727204005.34025: done checking for max_fail_percentage 7491 1727204005.34026: checking to see if all hosts have failed and the running result is not ok 7491 1727204005.34027: done checking to see if all hosts have failed 7491 1727204005.34028: getting the remaining hosts for this loop 7491 1727204005.34029: done getting the remaining hosts for this loop 7491 1727204005.34033: getting the next task for host managed-node3 7491 1727204005.34039: done getting next task for host managed-node3 7491 1727204005.34044: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7491 1727204005.34047: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204005.34073: getting variables 7491 1727204005.34075: in VariableManager get_vars() 7491 1727204005.34123: Calling all_inventory to load vars for managed-node3 7491 1727204005.34126: Calling groups_inventory to load vars for managed-node3 7491 1727204005.34128: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204005.34139: Calling all_plugins_play to load vars for managed-node3 7491 1727204005.34142: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204005.34144: Calling groups_plugins_play to load vars for managed-node3 7491 1727204005.34985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204005.36050: done with get_vars() 7491 1727204005.36070: done getting variables 7491 1727204005.36117: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:53:25 -0400 (0:00:00.034) 0:00:47.285 ***** 7491 1727204005.36145: entering _queue_task() for managed-node3/debug 7491 1727204005.36392: worker is 1 (out of 1 available) 7491 1727204005.36407: exiting _queue_task() for managed-node3/debug 7491 1727204005.36423: done queuing things up, now waiting for results queue to drain 7491 1727204005.36424: waiting for pending results... 
7491 1727204005.36617: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7491 1727204005.36716: in run() - task 0affcd87-79f5-0a4a-ad01-00000000011d 7491 1727204005.36730: variable 'ansible_search_path' from source: unknown 7491 1727204005.36734: variable 'ansible_search_path' from source: unknown 7491 1727204005.36767: calling self._execute() 7491 1727204005.36848: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.36852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204005.36861: variable 'omit' from source: magic vars 7491 1727204005.37152: variable 'ansible_distribution_major_version' from source: facts 7491 1727204005.37162: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204005.37170: variable 'omit' from source: magic vars 7491 1727204005.37220: variable 'omit' from source: magic vars 7491 1727204005.37248: variable 'omit' from source: magic vars 7491 1727204005.37288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204005.37323: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204005.37338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204005.37352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204005.37361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204005.37386: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204005.37389: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.37391: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed-node3' 7491 1727204005.37468: Set connection var ansible_timeout to 10 7491 1727204005.37471: Set connection var ansible_pipelining to False 7491 1727204005.37477: Set connection var ansible_shell_type to sh 7491 1727204005.37482: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204005.37489: Set connection var ansible_shell_executable to /bin/sh 7491 1727204005.37493: Set connection var ansible_connection to ssh 7491 1727204005.37516: variable 'ansible_shell_executable' from source: unknown 7491 1727204005.37522: variable 'ansible_connection' from source: unknown 7491 1727204005.37525: variable 'ansible_module_compression' from source: unknown 7491 1727204005.37528: variable 'ansible_shell_type' from source: unknown 7491 1727204005.37530: variable 'ansible_shell_executable' from source: unknown 7491 1727204005.37532: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.37534: variable 'ansible_pipelining' from source: unknown 7491 1727204005.37536: variable 'ansible_timeout' from source: unknown 7491 1727204005.37538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204005.37640: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204005.37650: variable 'omit' from source: magic vars 7491 1727204005.37660: starting attempt loop 7491 1727204005.37663: running the handler 7491 1727204005.37765: variable '__network_connections_result' from source: set_fact 7491 1727204005.37809: handler run complete 7491 1727204005.37826: attempt loop complete, returning result 7491 1727204005.37829: _execute() done 7491 1727204005.37831: dumping result to json 7491 1727204005.37833: done dumping result, returning 7491 
1727204005.37841: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-0a4a-ad01-00000000011d] 7491 1727204005.37851: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011d 7491 1727204005.37934: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011d 7491 1727204005.37936: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 7491 1727204005.38045: no more pending results, returning what we have 7491 1727204005.38048: results queue empty 7491 1727204005.38049: checking for any_errors_fatal 7491 1727204005.38060: done checking for any_errors_fatal 7491 1727204005.38061: checking for max_fail_percentage 7491 1727204005.38063: done checking for max_fail_percentage 7491 1727204005.38064: checking to see if all hosts have failed and the running result is not ok 7491 1727204005.38066: done checking to see if all hosts have failed 7491 1727204005.38066: getting the remaining hosts for this loop 7491 1727204005.38068: done getting the remaining hosts for this loop 7491 1727204005.38072: getting the next task for host managed-node3 7491 1727204005.38077: done getting next task for host managed-node3 7491 1727204005.38080: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7491 1727204005.38086: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7491 1727204005.38098: getting variables 7491 1727204005.38100: in VariableManager get_vars() 7491 1727204005.38143: Calling all_inventory to load vars for managed-node3 7491 1727204005.38145: Calling groups_inventory to load vars for managed-node3 7491 1727204005.38147: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204005.38156: Calling all_plugins_play to load vars for managed-node3 7491 1727204005.38158: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204005.38161: Calling groups_plugins_play to load vars for managed-node3 7491 1727204005.38985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204005.39914: done with get_vars() 7491 1727204005.39937: done getting variables 7491 1727204005.39985: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:53:25 -0400 (0:00:00.038) 0:00:47.324 ***** 7491 1727204005.40013: entering _queue_task() for managed-node3/debug 7491 1727204005.40260: worker is 1 (out of 1 available) 7491 1727204005.40275: exiting _queue_task() for managed-node3/debug 7491 1727204005.40289: done queuing things up, now waiting for results queue to drain 7491 1727204005.40291: waiting for pending results... 
7491 1727204005.40494: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7491 1727204005.40595: in run() - task 0affcd87-79f5-0a4a-ad01-00000000011e 7491 1727204005.40608: variable 'ansible_search_path' from source: unknown 7491 1727204005.40612: variable 'ansible_search_path' from source: unknown 7491 1727204005.40645: calling self._execute() 7491 1727204005.40730: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.40735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204005.40743: variable 'omit' from source: magic vars 7491 1727204005.41028: variable 'ansible_distribution_major_version' from source: facts 7491 1727204005.41038: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204005.41045: variable 'omit' from source: magic vars 7491 1727204005.41094: variable 'omit' from source: magic vars 7491 1727204005.41124: variable 'omit' from source: magic vars 7491 1727204005.41161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204005.41192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204005.41214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204005.41229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204005.41242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204005.41268: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204005.41277: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.41280: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727204005.41353: Set connection var ansible_timeout to 10 7491 1727204005.41359: Set connection var ansible_pipelining to False 7491 1727204005.41365: Set connection var ansible_shell_type to sh 7491 1727204005.41370: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204005.41377: Set connection var ansible_shell_executable to /bin/sh 7491 1727204005.41382: Set connection var ansible_connection to ssh 7491 1727204005.41401: variable 'ansible_shell_executable' from source: unknown 7491 1727204005.41405: variable 'ansible_connection' from source: unknown 7491 1727204005.41408: variable 'ansible_module_compression' from source: unknown 7491 1727204005.41411: variable 'ansible_shell_type' from source: unknown 7491 1727204005.41413: variable 'ansible_shell_executable' from source: unknown 7491 1727204005.41415: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.41418: variable 'ansible_pipelining' from source: unknown 7491 1727204005.41424: variable 'ansible_timeout' from source: unknown 7491 1727204005.41427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204005.41528: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204005.41538: variable 'omit' from source: magic vars 7491 1727204005.41541: starting attempt loop 7491 1727204005.41543: running the handler 7491 1727204005.41585: variable '__network_connections_result' from source: set_fact 7491 1727204005.41644: variable '__network_connections_result' from source: set_fact 7491 1727204005.41723: handler run complete 7491 1727204005.41745: attempt loop complete, returning result 7491 1727204005.41748: _execute() done 7491 1727204005.41750: dumping 
result to json 7491 1727204005.41754: done dumping result, returning 7491 1727204005.41763: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-0a4a-ad01-00000000011e] 7491 1727204005.41769: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011e 7491 1727204005.41859: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011e 7491 1727204005.41861: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "veth0",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
7491 1727204005.41949: no more pending results, returning what we have 7491 1727204005.41953: results queue empty 7491 1727204005.41954: checking for any_errors_fatal 7491 1727204005.41961: done checking for any_errors_fatal 7491 1727204005.41962: checking for max_fail_percentage 7491 1727204005.41965: done checking for max_fail_percentage 7491 1727204005.41967: checking to see if all hosts have failed and the running result is not ok 7491 1727204005.41968: done checking to see if all hosts have failed 7491 1727204005.41968: getting the remaining hosts for this loop 7491 1727204005.41971: done getting the remaining hosts for this loop 7491 1727204005.41974: getting the next task for host managed-node3 7491 1727204005.41980: done getting next task for host managed-node3 7491 1727204005.41983: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7491 1727204005.41986: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True,
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727204005.41999: getting variables 7491 1727204005.42001: in VariableManager get_vars() 7491 1727204005.42047: Calling all_inventory to load vars for managed-node3 7491 1727204005.42050: Calling groups_inventory to load vars for managed-node3 7491 1727204005.42052: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204005.42061: Calling all_plugins_play to load vars for managed-node3 7491 1727204005.42065: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204005.42068: Calling groups_plugins_play to load vars for managed-node3 7491 1727204005.43034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204005.43954: done with get_vars() 7491 1727204005.43978: done getting variables 7491 1727204005.44026: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:53:25 -0400 (0:00:00.040) 0:00:47.364 ***** 7491 1727204005.44061: entering _queue_task() for managed-node3/debug 7491 1727204005.44308: worker is 1 (out of 1 available) 7491 1727204005.44321: exiting _queue_task() 
for managed-node3/debug 7491 1727204005.44335: done queuing things up, now waiting for results queue to drain 7491 1727204005.44336: waiting for pending results... 7491 1727204005.44541: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7491 1727204005.44644: in run() - task 0affcd87-79f5-0a4a-ad01-00000000011f 7491 1727204005.44657: variable 'ansible_search_path' from source: unknown 7491 1727204005.44661: variable 'ansible_search_path' from source: unknown 7491 1727204005.44692: calling self._execute() 7491 1727204005.44782: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.44787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204005.44798: variable 'omit' from source: magic vars 7491 1727204005.45091: variable 'ansible_distribution_major_version' from source: facts 7491 1727204005.45103: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204005.45194: variable 'network_state' from source: role '' defaults 7491 1727204005.45204: Evaluated conditional (network_state != {}): False 7491 1727204005.45207: when evaluation is False, skipping this task 7491 1727204005.45210: _execute() done 7491 1727204005.45214: dumping result to json 7491 1727204005.45217: done dumping result, returning 7491 1727204005.45226: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-0a4a-ad01-00000000011f] 7491 1727204005.45232: sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011f 7491 1727204005.45328: done sending task result for task 0affcd87-79f5-0a4a-ad01-00000000011f 7491 1727204005.45331: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 7491 1727204005.45385: no more pending results, returning what we have 7491 1727204005.45389: results queue empty 7491 
1727204005.45390: checking for any_errors_fatal 7491 1727204005.45398: done checking for any_errors_fatal 7491 1727204005.45399: checking for max_fail_percentage 7491 1727204005.45401: done checking for max_fail_percentage 7491 1727204005.45402: checking to see if all hosts have failed and the running result is not ok 7491 1727204005.45404: done checking to see if all hosts have failed 7491 1727204005.45404: getting the remaining hosts for this loop 7491 1727204005.45406: done getting the remaining hosts for this loop 7491 1727204005.45410: getting the next task for host managed-node3 7491 1727204005.45418: done getting next task for host managed-node3 7491 1727204005.45422: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7491 1727204005.45425: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204005.45449: getting variables 7491 1727204005.45451: in VariableManager get_vars() 7491 1727204005.45512: Calling all_inventory to load vars for managed-node3 7491 1727204005.45515: Calling groups_inventory to load vars for managed-node3 7491 1727204005.45517: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204005.45528: Calling all_plugins_play to load vars for managed-node3 7491 1727204005.45531: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204005.45534: Calling groups_plugins_play to load vars for managed-node3 7491 1727204005.46641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204005.47563: done with get_vars() 7491 1727204005.47586: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:53:25 -0400 (0:00:00.036) 0:00:47.400 ***** 7491 1727204005.47666: entering _queue_task() for managed-node3/ping 7491 1727204005.47913: worker is 1 (out of 1 available) 7491 1727204005.47927: exiting _queue_task() for managed-node3/ping 7491 1727204005.47941: done queuing things up, now waiting for results queue to drain 7491 1727204005.47943: waiting for pending results... 
7491 1727204005.48145: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7491 1727204005.48237: in run() - task 0affcd87-79f5-0a4a-ad01-000000000120 7491 1727204005.48249: variable 'ansible_search_path' from source: unknown 7491 1727204005.48252: variable 'ansible_search_path' from source: unknown 7491 1727204005.48285: calling self._execute() 7491 1727204005.48368: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.48378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204005.48388: variable 'omit' from source: magic vars 7491 1727204005.48680: variable 'ansible_distribution_major_version' from source: facts 7491 1727204005.48691: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204005.48697: variable 'omit' from source: magic vars 7491 1727204005.48749: variable 'omit' from source: magic vars 7491 1727204005.48776: variable 'omit' from source: magic vars 7491 1727204005.48816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204005.48851: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204005.48872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204005.48885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204005.48895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204005.48926: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204005.48930: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.48933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 
1727204005.49004: Set connection var ansible_timeout to 10 7491 1727204005.49009: Set connection var ansible_pipelining to False 7491 1727204005.49014: Set connection var ansible_shell_type to sh 7491 1727204005.49020: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204005.49030: Set connection var ansible_shell_executable to /bin/sh 7491 1727204005.49034: Set connection var ansible_connection to ssh 7491 1727204005.49054: variable 'ansible_shell_executable' from source: unknown 7491 1727204005.49057: variable 'ansible_connection' from source: unknown 7491 1727204005.49060: variable 'ansible_module_compression' from source: unknown 7491 1727204005.49063: variable 'ansible_shell_type' from source: unknown 7491 1727204005.49066: variable 'ansible_shell_executable' from source: unknown 7491 1727204005.49069: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.49071: variable 'ansible_pipelining' from source: unknown 7491 1727204005.49075: variable 'ansible_timeout' from source: unknown 7491 1727204005.49079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204005.49234: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7491 1727204005.49243: variable 'omit' from source: magic vars 7491 1727204005.49250: starting attempt loop 7491 1727204005.49253: running the handler 7491 1727204005.49268: _low_level_execute_command(): starting 7491 1727204005.49279: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727204005.49814: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204005.49835: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204005.49850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204005.49861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204005.49908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204005.49924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204005.49984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204005.51565: stdout chunk (state=3): >>>/root <<< 7491 1727204005.51667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204005.51723: stderr chunk (state=3): >>><<< 7491 1727204005.51733: stdout chunk (state=3): >>><<< 7491 1727204005.51756: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204005.51769: _low_level_execute_command(): starting 7491 1727204005.51775: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204005.5175676-9573-228088653266476 `" && echo ansible-tmp-1727204005.5175676-9573-228088653266476="` echo /root/.ansible/tmp/ansible-tmp-1727204005.5175676-9573-228088653266476 `" ) && sleep 0' 7491 1727204005.52241: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204005.52255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204005.52277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204005.52294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204005.52342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204005.52354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204005.52417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204005.54242: stdout chunk (state=3): >>>ansible-tmp-1727204005.5175676-9573-228088653266476=/root/.ansible/tmp/ansible-tmp-1727204005.5175676-9573-228088653266476 <<< 7491 1727204005.54349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204005.54406: stderr chunk (state=3): >>><<< 7491 1727204005.54413: stdout chunk (state=3): >>><<< 7491 1727204005.54438: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204005.5175676-9573-228088653266476=/root/.ansible/tmp/ansible-tmp-1727204005.5175676-9573-228088653266476 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 
10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204005.54481: variable 'ansible_module_compression' from source: unknown 7491 1727204005.54516: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7491 1727204005.54551: variable 'ansible_facts' from source: unknown 7491 1727204005.54608: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204005.5175676-9573-228088653266476/AnsiballZ_ping.py 7491 1727204005.54721: Sending initial data 7491 1727204005.54730: Sent initial data (151 bytes) 7491 1727204005.55437: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204005.55440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204005.55469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204005.55486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204005.55490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204005.55547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204005.55550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204005.55556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204005.55598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204005.57304: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727204005.57344: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727204005.57393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpe_5wluzh /root/.ansible/tmp/ansible-tmp-1727204005.5175676-9573-228088653266476/AnsiballZ_ping.py <<< 7491 1727204005.57433: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727204005.58218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204005.58330: stderr chunk (state=3): >>><<< 7491 1727204005.58333: stdout chunk (state=3): >>><<< 7491 1727204005.58351: done transferring module to remote 7491 1727204005.58360: _low_level_execute_command(): starting 7491 1727204005.58368: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204005.5175676-9573-228088653266476/ /root/.ansible/tmp/ansible-tmp-1727204005.5175676-9573-228088653266476/AnsiballZ_ping.py && sleep 0' 7491 1727204005.58847: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204005.58853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204005.58880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204005.58883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204005.58886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204005.58938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204005.58941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204005.58989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204005.60719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204005.60774: stderr chunk (state=3): >>><<< 7491 1727204005.60778: stdout chunk (state=3): >>><<< 7491 1727204005.60794: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204005.60805: _low_level_execute_command(): starting 7491 1727204005.60808: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204005.5175676-9573-228088653266476/AnsiballZ_ping.py && sleep 0' 7491 1727204005.61278: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204005.61291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204005.61307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204005.61328: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204005.61377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204005.61389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204005.61446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204005.74116: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7491 1727204005.75037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727204005.75103: stderr chunk (state=3): >>><<< 7491 1727204005.75107: stdout chunk (state=3): >>><<< 7491 1727204005.75129: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
7491 1727204005.75151: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204005.5175676-9573-228088653266476/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727204005.75159: _low_level_execute_command(): starting 7491 1727204005.75166: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204005.5175676-9573-228088653266476/ > /dev/null 2>&1 && sleep 0' 7491 1727204005.75652: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204005.75674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204005.75688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204005.75699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204005.75752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204005.75772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204005.75809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204005.77560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204005.77624: stderr chunk (state=3): >>><<< 7491 1727204005.77628: stdout chunk (state=3): >>><<< 7491 1727204005.77642: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204005.77649: handler run complete 7491 1727204005.77663: attempt loop complete, returning result 7491 1727204005.77667: _execute() 
done 7491 1727204005.77669: dumping result to json 7491 1727204005.77673: done dumping result, returning 7491 1727204005.77689: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-0a4a-ad01-000000000120] 7491 1727204005.77693: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000120 7491 1727204005.77784: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000120 7491 1727204005.77787: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 7491 1727204005.77883: no more pending results, returning what we have 7491 1727204005.77893: results queue empty 7491 1727204005.77894: checking for any_errors_fatal 7491 1727204005.77903: done checking for any_errors_fatal 7491 1727204005.77904: checking for max_fail_percentage 7491 1727204005.77905: done checking for max_fail_percentage 7491 1727204005.77906: checking to see if all hosts have failed and the running result is not ok 7491 1727204005.77908: done checking to see if all hosts have failed 7491 1727204005.77908: getting the remaining hosts for this loop 7491 1727204005.77910: done getting the remaining hosts for this loop 7491 1727204005.77914: getting the next task for host managed-node3 7491 1727204005.77923: done getting next task for host managed-node3 7491 1727204005.77925: ^ task is: TASK: meta (role_complete) 7491 1727204005.77928: ^ state is: HOST STATE: block=2, task=38, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7491 1727204005.77940: getting variables 7491 1727204005.77942: in VariableManager get_vars() 7491 1727204005.77996: Calling all_inventory to load vars for managed-node3 7491 1727204005.78000: Calling groups_inventory to load vars for managed-node3 7491 1727204005.78002: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204005.78011: Calling all_plugins_play to load vars for managed-node3 7491 1727204005.78013: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204005.78016: Calling groups_plugins_play to load vars for managed-node3 7491 1727204005.83433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204005.84345: done with get_vars() 7491 1727204005.84371: done getting variables 7491 1727204005.84426: done queuing things up, now waiting for results queue to drain 7491 1727204005.84428: results queue empty 7491 1727204005.84429: checking for any_errors_fatal 7491 1727204005.84431: done checking for any_errors_fatal 7491 1727204005.84432: checking for max_fail_percentage 7491 1727204005.84432: done checking for max_fail_percentage 7491 1727204005.84433: checking to see if all hosts have failed and the running result is not ok 7491 1727204005.84433: done checking to see if all hosts have failed 7491 1727204005.84434: getting the remaining hosts for this loop 7491 1727204005.84434: done getting the remaining hosts for this loop 7491 1727204005.84437: getting the next task for host managed-node3 7491 1727204005.84440: done getting next task for host managed-node3 7491 1727204005.84442: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7491 1727204005.84443: ^ state is: HOST STATE: block=2, task=39, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 7491 1727204005.84445: getting variables 7491 1727204005.84445: in VariableManager get_vars() 7491 1727204005.84460: Calling all_inventory to load vars for managed-node3 7491 1727204005.84462: Calling groups_inventory to load vars for managed-node3 7491 1727204005.84465: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204005.84469: Calling all_plugins_play to load vars for managed-node3 7491 1727204005.84470: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204005.84472: Calling groups_plugins_play to load vars for managed-node3 7491 1727204005.85138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204005.86045: done with get_vars() 7491 1727204005.86060: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:145 Tuesday 24 September 2024 14:53:25 -0400 (0:00:00.384) 0:00:47.785 ***** 7491 1727204005.86115: entering _queue_task() for managed-node3/include_tasks 7491 1727204005.86361: worker is 1 (out of 1 available) 7491 1727204005.86375: exiting _queue_task() for managed-node3/include_tasks 7491 1727204005.86388: done queuing things up, now waiting for results queue to drain 7491 1727204005.86390: waiting for pending results... 
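From the trace above, the include at tests_auto_gateway.yml:145 evaluates the conditional `ansible_distribution_major_version != '6'` and then pulls in manage_test_interface.yml. A minimal sketch of what that playbook task presumably looks like — the variables passed to the include (if any) are not visible in this log and are omitted, and the distribution check may be applied at play level rather than on this task:

```yaml
# Sketch inferred from the task path and conditional logged above.
# Any vars: passed to the include are an assumption and left out.
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
  when: ansible_distribution_major_version != '6'
```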
7491 1727204005.86583: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 7491 1727204005.86669: in run() - task 0affcd87-79f5-0a4a-ad01-000000000150 7491 1727204005.86680: variable 'ansible_search_path' from source: unknown 7491 1727204005.86713: calling self._execute() 7491 1727204005.86799: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.86803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204005.86812: variable 'omit' from source: magic vars 7491 1727204005.87115: variable 'ansible_distribution_major_version' from source: facts 7491 1727204005.87128: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204005.87134: _execute() done 7491 1727204005.87137: dumping result to json 7491 1727204005.87139: done dumping result, returning 7491 1727204005.87146: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-0a4a-ad01-000000000150] 7491 1727204005.87152: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000150 7491 1727204005.87256: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000150 7491 1727204005.87260: WORKER PROCESS EXITING 7491 1727204005.87302: no more pending results, returning what we have 7491 1727204005.87308: in VariableManager get_vars() 7491 1727204005.87370: Calling all_inventory to load vars for managed-node3 7491 1727204005.87373: Calling groups_inventory to load vars for managed-node3 7491 1727204005.87379: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204005.87392: Calling all_plugins_play to load vars for managed-node3 7491 1727204005.87395: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204005.87397: Calling groups_plugins_play to load vars for managed-node3 7491 1727204005.88362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7491 1727204005.89308: done with get_vars() 7491 1727204005.89333: variable 'ansible_search_path' from source: unknown 7491 1727204005.89347: we have included files to process 7491 1727204005.89348: generating all_blocks data 7491 1727204005.89350: done generating all_blocks data 7491 1727204005.89355: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727204005.89355: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727204005.89357: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7491 1727204005.89633: in VariableManager get_vars() 7491 1727204005.89656: done with get_vars() 7491 1727204005.90091: done processing included file 7491 1727204005.90093: iterating over new_blocks loaded from include file 7491 1727204005.90094: in VariableManager get_vars() 7491 1727204005.90111: done with get_vars() 7491 1727204005.90112: filtering new block on tags 7491 1727204005.90134: done filtering new block on tags 7491 1727204005.90135: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 7491 1727204005.90139: extending task lists for all hosts with included blocks 7491 1727204005.93470: done extending task lists 7491 1727204005.93472: done processing included files 7491 1727204005.93473: results queue empty 7491 1727204005.93473: checking for any_errors_fatal 7491 1727204005.93474: done checking for any_errors_fatal 7491 1727204005.93475: checking for max_fail_percentage 7491 1727204005.93476: done checking for max_fail_percentage 7491 1727204005.93476: checking to see if all hosts have failed and the running 
result is not ok 7491 1727204005.93477: done checking to see if all hosts have failed 7491 1727204005.93477: getting the remaining hosts for this loop 7491 1727204005.93479: done getting the remaining hosts for this loop 7491 1727204005.93480: getting the next task for host managed-node3 7491 1727204005.93483: done getting next task for host managed-node3 7491 1727204005.93485: ^ task is: TASK: Ensure state in ["present", "absent"] 7491 1727204005.93486: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204005.93488: getting variables 7491 1727204005.93489: in VariableManager get_vars() 7491 1727204005.93506: Calling all_inventory to load vars for managed-node3 7491 1727204005.93508: Calling groups_inventory to load vars for managed-node3 7491 1727204005.93509: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204005.93515: Calling all_plugins_play to load vars for managed-node3 7491 1727204005.93516: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204005.93518: Calling groups_plugins_play to load vars for managed-node3 7491 1727204005.94324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204005.95263: done with get_vars() 7491 1727204005.95287: done getting variables 7491 1727204005.95325: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:53:25 -0400 (0:00:00.092) 0:00:47.877 ***** 7491 1727204005.95348: entering _queue_task() for managed-node3/fail 7491 1727204005.95605: worker is 1 (out of 1 available) 7491 1727204005.95621: exiting _queue_task() for managed-node3/fail 7491 1727204005.95636: done queuing things up, now waiting for results queue to drain 7491 1727204005.95637: waiting for pending results... 
7491 1727204005.95833: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 7491 1727204005.95903: in run() - task 0affcd87-79f5-0a4a-ad01-000000001a6f 7491 1727204005.95915: variable 'ansible_search_path' from source: unknown 7491 1727204005.95921: variable 'ansible_search_path' from source: unknown 7491 1727204005.95948: calling self._execute() 7491 1727204005.96040: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.96043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204005.96053: variable 'omit' from source: magic vars 7491 1727204005.96347: variable 'ansible_distribution_major_version' from source: facts 7491 1727204005.96357: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204005.96462: variable 'state' from source: include params 7491 1727204005.96471: Evaluated conditional (state not in ["present", "absent"]): False 7491 1727204005.96475: when evaluation is False, skipping this task 7491 1727204005.96478: _execute() done 7491 1727204005.96480: dumping result to json 7491 1727204005.96483: done dumping result, returning 7491 1727204005.96490: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-0a4a-ad01-000000001a6f] 7491 1727204005.96497: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a6f 7491 1727204005.96593: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a6f 7491 1727204005.96597: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7491 1727204005.96671: no more pending results, returning what we have 7491 1727204005.96676: results queue empty 7491 1727204005.96677: checking for any_errors_fatal 7491 1727204005.96678: done checking for any_errors_fatal 7491 1727204005.96679: checking for 
max_fail_percentage 7491 1727204005.96681: done checking for max_fail_percentage 7491 1727204005.96682: checking to see if all hosts have failed and the running result is not ok 7491 1727204005.96683: done checking to see if all hosts have failed 7491 1727204005.96684: getting the remaining hosts for this loop 7491 1727204005.96686: done getting the remaining hosts for this loop 7491 1727204005.96690: getting the next task for host managed-node3 7491 1727204005.96695: done getting next task for host managed-node3 7491 1727204005.96697: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7491 1727204005.96700: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204005.96703: getting variables 7491 1727204005.96705: in VariableManager get_vars() 7491 1727204005.96760: Calling all_inventory to load vars for managed-node3 7491 1727204005.96765: Calling groups_inventory to load vars for managed-node3 7491 1727204005.96767: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204005.96778: Calling all_plugins_play to load vars for managed-node3 7491 1727204005.96780: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204005.96783: Calling groups_plugins_play to load vars for managed-node3 7491 1727204005.97625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204005.98576: done with get_vars() 7491 1727204005.98599: done getting variables 7491 1727204005.98648: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:53:25 -0400 (0:00:00.033) 0:00:47.910 ***** 7491 1727204005.98677: entering _queue_task() for managed-node3/fail 7491 1727204005.98927: worker is 1 (out of 1 available) 7491 1727204005.98942: exiting _queue_task() for managed-node3/fail 7491 1727204005.98956: done queuing things up, now waiting for results queue to drain 7491 1727204005.98958: waiting for pending results... 
7491 1727204005.99153: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7491 1727204005.99230: in run() - task 0affcd87-79f5-0a4a-ad01-000000001a70 7491 1727204005.99242: variable 'ansible_search_path' from source: unknown 7491 1727204005.99245: variable 'ansible_search_path' from source: unknown 7491 1727204005.99277: calling self._execute() 7491 1727204005.99372: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204005.99376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204005.99384: variable 'omit' from source: magic vars 7491 1727204005.99684: variable 'ansible_distribution_major_version' from source: facts 7491 1727204005.99693: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204005.99799: variable 'type' from source: play vars 7491 1727204005.99804: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7491 1727204005.99808: when evaluation is False, skipping this task 7491 1727204005.99810: _execute() done 7491 1727204005.99813: dumping result to json 7491 1727204005.99816: done dumping result, returning 7491 1727204005.99824: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-0a4a-ad01-000000001a70] 7491 1727204005.99831: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a70 7491 1727204005.99921: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a70 7491 1727204005.99925: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7491 1727204005.99986: no more pending results, returning what we have 7491 1727204005.99990: results queue empty 7491 1727204005.99991: checking for any_errors_fatal 7491 1727204006.00003: done checking for any_errors_fatal 7491 1727204006.00004: checking for 
max_fail_percentage 7491 1727204006.00005: done checking for max_fail_percentage 7491 1727204006.00006: checking to see if all hosts have failed and the running result is not ok 7491 1727204006.00007: done checking to see if all hosts have failed 7491 1727204006.00008: getting the remaining hosts for this loop 7491 1727204006.00010: done getting the remaining hosts for this loop 7491 1727204006.00014: getting the next task for host managed-node3 7491 1727204006.00022: done getting next task for host managed-node3 7491 1727204006.00025: ^ task is: TASK: Include the task 'show_interfaces.yml' 7491 1727204006.00027: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204006.00031: getting variables 7491 1727204006.00039: in VariableManager get_vars() 7491 1727204006.00088: Calling all_inventory to load vars for managed-node3 7491 1727204006.00091: Calling groups_inventory to load vars for managed-node3 7491 1727204006.00093: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204006.00103: Calling all_plugins_play to load vars for managed-node3 7491 1727204006.00106: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204006.00108: Calling groups_plugins_play to load vars for managed-node3 7491 1727204006.01044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204006.01970: done with get_vars() 7491 1727204006.01994: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:53:26 -0400 (0:00:00.033) 0:00:47.944 ***** 7491 1727204006.02072: entering _queue_task() for managed-node3/include_tasks 7491 1727204006.02320: worker is 1 (out of 1 available) 7491 1727204006.02331: exiting _queue_task() for managed-node3/include_tasks 7491 1727204006.02346: done queuing things up, now waiting for results queue to drain 7491 1727204006.02348: waiting for pending results... 
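The two skipped guard tasks above report their gating expressions verbatim in the skip results (`false_condition`). Reconstructed as a sketch, they are plain `fail` tasks whose `when` clauses match those logged conditions; the `msg` texts below are illustrative assumptions, not taken from the test files:

```yaml
# Conditions copied from the logged false_condition values
# (manage_test_interface.yml:3 and :8); the failure messages
# are assumptions.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Unsupported state: {{ state }}"
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Unsupported type: {{ type }}"
  when: type not in ["dummy", "tap", "veth"]
```

Because both conditions evaluate to False here (`state` and `type` hold allowed values), each task is skipped rather than failing the play, which matches the `skipping: [managed-node3]` results above.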
7491 1727204006.02549: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 7491 1727204006.02632: in run() - task 0affcd87-79f5-0a4a-ad01-000000001a71 7491 1727204006.02648: variable 'ansible_search_path' from source: unknown 7491 1727204006.02653: variable 'ansible_search_path' from source: unknown 7491 1727204006.02683: calling self._execute() 7491 1727204006.02771: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.02774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.02784: variable 'omit' from source: magic vars 7491 1727204006.03084: variable 'ansible_distribution_major_version' from source: facts 7491 1727204006.03095: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204006.03101: _execute() done 7491 1727204006.03104: dumping result to json 7491 1727204006.03107: done dumping result, returning 7491 1727204006.03114: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-0a4a-ad01-000000001a71] 7491 1727204006.03125: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a71 7491 1727204006.03215: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a71 7491 1727204006.03218: WORKER PROCESS EXITING 7491 1727204006.03249: no more pending results, returning what we have 7491 1727204006.03254: in VariableManager get_vars() 7491 1727204006.03314: Calling all_inventory to load vars for managed-node3 7491 1727204006.03318: Calling groups_inventory to load vars for managed-node3 7491 1727204006.03323: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204006.03337: Calling all_plugins_play to load vars for managed-node3 7491 1727204006.03339: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204006.03342: Calling groups_plugins_play to load vars for managed-node3 7491 1727204006.04213: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204006.05278: done with get_vars() 7491 1727204006.05293: variable 'ansible_search_path' from source: unknown 7491 1727204006.05294: variable 'ansible_search_path' from source: unknown 7491 1727204006.05326: we have included files to process 7491 1727204006.05326: generating all_blocks data 7491 1727204006.05328: done generating all_blocks data 7491 1727204006.05331: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727204006.05332: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727204006.05333: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7491 1727204006.05412: in VariableManager get_vars() 7491 1727204006.05436: done with get_vars() 7491 1727204006.05523: done processing included file 7491 1727204006.05524: iterating over new_blocks loaded from include file 7491 1727204006.05525: in VariableManager get_vars() 7491 1727204006.05542: done with get_vars() 7491 1727204006.05543: filtering new block on tags 7491 1727204006.05555: done filtering new block on tags 7491 1727204006.05557: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 7491 1727204006.05561: extending task lists for all hosts with included blocks 7491 1727204006.05802: done extending task lists 7491 1727204006.05803: done processing included files 7491 1727204006.05804: results queue empty 7491 1727204006.05805: checking for any_errors_fatal 7491 1727204006.05807: done checking for any_errors_fatal 7491 1727204006.05807: checking for max_fail_percentage 7491 
1727204006.05808: done checking for max_fail_percentage 7491 1727204006.05809: checking to see if all hosts have failed and the running result is not ok 7491 1727204006.05809: done checking to see if all hosts have failed 7491 1727204006.05810: getting the remaining hosts for this loop 7491 1727204006.05811: done getting the remaining hosts for this loop 7491 1727204006.05812: getting the next task for host managed-node3 7491 1727204006.05815: done getting next task for host managed-node3 7491 1727204006.05816: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7491 1727204006.05820: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204006.05822: getting variables 7491 1727204006.05823: in VariableManager get_vars() 7491 1727204006.05836: Calling all_inventory to load vars for managed-node3 7491 1727204006.05838: Calling groups_inventory to load vars for managed-node3 7491 1727204006.05839: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204006.05843: Calling all_plugins_play to load vars for managed-node3 7491 1727204006.05845: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204006.05846: Calling groups_plugins_play to load vars for managed-node3 7491 1727204006.06561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204006.07485: done with get_vars() 7491 1727204006.07505: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:53:26 -0400 (0:00:00.054) 0:00:47.999 ***** 7491 1727204006.07572: entering _queue_task() for managed-node3/include_tasks 7491 1727204006.07829: worker is 1 (out of 1 available) 7491 1727204006.07843: exiting _queue_task() for managed-node3/include_tasks 7491 1727204006.07857: done queuing things up, now waiting for results queue to drain 7491 1727204006.07858: waiting for pending results... 
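The task paths logged above (show_interfaces.yml:3 queuing an include of get_current_interfaces.yml) imply a simple nesting of helper files; a sketch of the first task of show_interfaces.yml under that assumption:

```yaml
# show_interfaces.yml (first task), inferred from the logged
# task path; the rest of the file's contents are not visible
# in this log.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```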
7491 1727204006.08051: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 7491 1727204006.08132: in run() - task 0affcd87-79f5-0a4a-ad01-000000001d1c 7491 1727204006.08143: variable 'ansible_search_path' from source: unknown 7491 1727204006.08147: variable 'ansible_search_path' from source: unknown 7491 1727204006.08179: calling self._execute() 7491 1727204006.08263: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.08269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.08278: variable 'omit' from source: magic vars 7491 1727204006.08569: variable 'ansible_distribution_major_version' from source: facts 7491 1727204006.08580: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204006.08586: _execute() done 7491 1727204006.08589: dumping result to json 7491 1727204006.08592: done dumping result, returning 7491 1727204006.08599: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-0a4a-ad01-000000001d1c] 7491 1727204006.08606: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d1c 7491 1727204006.08696: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d1c 7491 1727204006.08699: WORKER PROCESS EXITING 7491 1727204006.08735: no more pending results, returning what we have 7491 1727204006.08740: in VariableManager get_vars() 7491 1727204006.08801: Calling all_inventory to load vars for managed-node3 7491 1727204006.08804: Calling groups_inventory to load vars for managed-node3 7491 1727204006.08806: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204006.08827: Calling all_plugins_play to load vars for managed-node3 7491 1727204006.08830: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204006.08837: Calling groups_plugins_play to load vars for managed-node3 7491 1727204006.09858: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204006.11514: done with get_vars() 7491 1727204006.11544: variable 'ansible_search_path' from source: unknown 7491 1727204006.11545: variable 'ansible_search_path' from source: unknown 7491 1727204006.11614: we have included files to process 7491 1727204006.11615: generating all_blocks data 7491 1727204006.11617: done generating all_blocks data 7491 1727204006.11619: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727204006.11620: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727204006.11622: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7491 1727204006.11904: done processing included file 7491 1727204006.11906: iterating over new_blocks loaded from include file 7491 1727204006.11908: in VariableManager get_vars() 7491 1727204006.11937: done with get_vars() 7491 1727204006.11940: filtering new block on tags 7491 1727204006.11960: done filtering new block on tags 7491 1727204006.11962: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 7491 1727204006.11969: extending task lists for all hosts with included blocks 7491 1727204006.12126: done extending task lists 7491 1727204006.12127: done processing included files 7491 1727204006.12128: results queue empty 7491 1727204006.12129: checking for any_errors_fatal 7491 1727204006.12133: done checking for any_errors_fatal 7491 1727204006.12134: checking for max_fail_percentage 7491 1727204006.12135: done checking for max_fail_percentage 7491 
1727204006.12136: checking to see if all hosts have failed and the running result is not ok 7491 1727204006.12137: done checking to see if all hosts have failed 7491 1727204006.12137: getting the remaining hosts for this loop 7491 1727204006.12139: done getting the remaining hosts for this loop 7491 1727204006.12141: getting the next task for host managed-node3 7491 1727204006.12146: done getting next task for host managed-node3 7491 1727204006.12148: ^ task is: TASK: Gather current interface info 7491 1727204006.12152: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204006.12155: getting variables 7491 1727204006.12156: in VariableManager get_vars() 7491 1727204006.12179: Calling all_inventory to load vars for managed-node3 7491 1727204006.12181: Calling groups_inventory to load vars for managed-node3 7491 1727204006.12183: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204006.12188: Calling all_plugins_play to load vars for managed-node3 7491 1727204006.12190: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204006.12193: Calling groups_plugins_play to load vars for managed-node3 7491 1727204006.13207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204006.14115: done with get_vars() 7491 1727204006.14133: done getting variables 7491 1727204006.14168: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:53:26 -0400 (0:00:00.066) 0:00:48.065 ***** 7491 1727204006.14192: entering _queue_task() for managed-node3/command 7491 1727204006.14437: worker is 1 (out of 1 available) 7491 1727204006.14449: exiting _queue_task() for managed-node3/command 7491 1727204006.14463: done queuing things up, now waiting for results queue to drain 7491 1727204006.14466: waiting for pending results... 
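[Editor's note: the log above shows the controller queuing the `command` action for the "Gather current interface info" task (get_current_interfaces.yml:3). The task file itself is not reproduced in this log, but the module invocation recorded further down gives its arguments: `_raw_params="ls -1"` with `chdir="/sys/class/net"`. A minimal local sketch of what that remote operation amounts to, assuming only those recorded arguments — the function name and signature here are illustrative, not Ansible API:]

```python
# Hedged sketch: the local equivalent of the command-module invocation the log
# records (run "ls -1" with the working directory set to /sys/class/net).
# gather_current_interfaces() is a hypothetical helper, not part of Ansible.
import subprocess


def gather_current_interfaces(sysfs_net="/sys/class/net"):
    """Return the network interface names listed under the given sysfs dir."""
    result = subprocess.run(
        ["ls", "-1"],          # matches cmd ["ls", "-1"] in the module result
        cwd=sysfs_net,         # matches module_args chdir=/sys/class/net
        capture_output=True,
        text=True,
        check=True,
    )
    # One interface name per line of stdout, like the module's stdout_lines.
    return result.stdout.splitlines()
```

On the managed node in this run, the same listing yields `eth0`, `lo`, `peerveth0`, and `veth0`, as the module result below shows.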
7491 1727204006.14666: running TaskExecutor() for managed-node3/TASK: Gather current interface info 7491 1727204006.14754: in run() - task 0affcd87-79f5-0a4a-ad01-000000001d53 7491 1727204006.14765: variable 'ansible_search_path' from source: unknown 7491 1727204006.14769: variable 'ansible_search_path' from source: unknown 7491 1727204006.14800: calling self._execute() 7491 1727204006.14887: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.14892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.14898: variable 'omit' from source: magic vars 7491 1727204006.15198: variable 'ansible_distribution_major_version' from source: facts 7491 1727204006.15208: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204006.15214: variable 'omit' from source: magic vars 7491 1727204006.15255: variable 'omit' from source: magic vars 7491 1727204006.15281: variable 'omit' from source: magic vars 7491 1727204006.15316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204006.15346: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204006.15368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204006.15383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204006.15391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204006.15417: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204006.15423: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.15425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.15498: Set connection 
var ansible_timeout to 10 7491 1727204006.15502: Set connection var ansible_pipelining to False 7491 1727204006.15507: Set connection var ansible_shell_type to sh 7491 1727204006.15512: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204006.15524: Set connection var ansible_shell_executable to /bin/sh 7491 1727204006.15527: Set connection var ansible_connection to ssh 7491 1727204006.15547: variable 'ansible_shell_executable' from source: unknown 7491 1727204006.15550: variable 'ansible_connection' from source: unknown 7491 1727204006.15553: variable 'ansible_module_compression' from source: unknown 7491 1727204006.15556: variable 'ansible_shell_type' from source: unknown 7491 1727204006.15559: variable 'ansible_shell_executable' from source: unknown 7491 1727204006.15562: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.15564: variable 'ansible_pipelining' from source: unknown 7491 1727204006.15566: variable 'ansible_timeout' from source: unknown 7491 1727204006.15570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.15675: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204006.15687: variable 'omit' from source: magic vars 7491 1727204006.15693: starting attempt loop 7491 1727204006.15695: running the handler 7491 1727204006.15709: _low_level_execute_command(): starting 7491 1727204006.15715: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727204006.16251: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.16271: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.16289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.16301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.16351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204006.16366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204006.16422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204006.18045: stdout chunk (state=3): >>>/root <<< 7491 1727204006.18147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204006.18205: stderr chunk (state=3): >>><<< 7491 1727204006.18213: stdout chunk (state=3): >>><<< 7491 1727204006.18236: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204006.18247: _low_level_execute_command(): starting 7491 1727204006.18253: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204006.1823688-9593-154790172682282 `" && echo ansible-tmp-1727204006.1823688-9593-154790172682282="` echo /root/.ansible/tmp/ansible-tmp-1727204006.1823688-9593-154790172682282 `" ) && sleep 0' 7491 1727204006.18722: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.18735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.18757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204006.18773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204006.18792: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.18829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204006.18842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204006.18897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204006.20700: stdout chunk (state=3): >>>ansible-tmp-1727204006.1823688-9593-154790172682282=/root/.ansible/tmp/ansible-tmp-1727204006.1823688-9593-154790172682282 <<< 7491 1727204006.20808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204006.20869: stderr chunk (state=3): >>><<< 7491 1727204006.20872: stdout chunk (state=3): >>><<< 7491 1727204006.20891: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204006.1823688-9593-154790172682282=/root/.ansible/tmp/ansible-tmp-1727204006.1823688-9593-154790172682282 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204006.20921: variable 'ansible_module_compression' from source: unknown 7491 1727204006.20967: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727204006.20998: variable 'ansible_facts' from source: unknown 7491 1727204006.21047: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204006.1823688-9593-154790172682282/AnsiballZ_command.py 7491 1727204006.21157: Sending initial data 7491 1727204006.21170: Sent initial data (154 bytes) 7491 1727204006.21870: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.21874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.21912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204006.21915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.21923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 debug2: match found <<< 7491 1727204006.21925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.21979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204006.21982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204006.21990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204006.22028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204006.23690: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727204006.23724: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727204006.23766: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpzaki63ag /root/.ansible/tmp/ansible-tmp-1727204006.1823688-9593-154790172682282/AnsiballZ_command.py <<< 7491 1727204006.23803: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727204006.24601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204006.24715: stderr chunk (state=3): >>><<< 7491 1727204006.24719: stdout chunk (state=3): >>><<< 7491 1727204006.24740: done transferring module to remote 7491 1727204006.24749: 
_low_level_execute_command(): starting 7491 1727204006.24754: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204006.1823688-9593-154790172682282/ /root/.ansible/tmp/ansible-tmp-1727204006.1823688-9593-154790172682282/AnsiballZ_command.py && sleep 0' 7491 1727204006.25224: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.25238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.25249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.25260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.25278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.25321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204006.25339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204006.25380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204006.27030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204006.27089: stderr chunk (state=3): >>><<< 7491 1727204006.27093: 
stdout chunk (state=3): >>><<< 7491 1727204006.27113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204006.27121: _low_level_execute_command(): starting 7491 1727204006.27124: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204006.1823688-9593-154790172682282/AnsiballZ_command.py && sleep 0' 7491 1727204006.27591: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.27606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.27621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 
1727204006.27633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727204006.27649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.27692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204006.27708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204006.27753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204006.40951: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:53:26.405578", "end": "2024-09-24 14:53:26.408680", "delta": "0:00:00.003102", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727204006.42073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727204006.42135: stderr chunk (state=3): >>><<< 7491 1727204006.42139: stdout chunk (state=3): >>><<< 7491 1727204006.42157: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:53:26.405578", "end": "2024-09-24 14:53:26.408680", "delta": "0:00:00.003102", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
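[Editor's note: the module result above is a JSON document whose `stdout` field carries the newline-separated interface names; the next task, "Set current_interfaces", then reads the registered `_current_interfaces` variable (the log shows it coming "from source: set_fact"). A short sketch of how that result decodes into the interface list — the assumption that the follow-up task simply takes `stdout_lines` is inferred, not shown in this log:]

```python
# Hedged sketch: decode the command module's JSON result (abridged to the
# fields used here; the full payload appears verbatim in the log above) and
# derive the list a set_fact on stdout_lines would presumably store.
import json

raw_result = '{"changed": true, "stdout": "eth0\\nlo\\npeerveth0\\nveth0", "rc": 0}'

result = json.loads(raw_result)
# Equivalent of _current_interfaces.stdout_lines in the registered result.
current_interfaces = result["stdout"].splitlines()
# current_interfaces is now ["eth0", "lo", "peerveth0", "veth0"]
```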
7491 1727204006.42193: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204006.1823688-9593-154790172682282/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727204006.42203: _low_level_execute_command(): starting 7491 1727204006.42207: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204006.1823688-9593-154790172682282/ > /dev/null 2>&1 && sleep 0' 7491 1727204006.42683: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.42698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.42719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204006.42734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.42784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204006.42796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204006.42851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204006.44590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204006.44642: stderr chunk (state=3): >>><<< 7491 1727204006.44646: stdout chunk (state=3): >>><<< 7491 1727204006.44660: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204006.44669: handler run complete 7491 
1727204006.44688: Evaluated conditional (False): False 7491 1727204006.44697: attempt loop complete, returning result 7491 1727204006.44700: _execute() done 7491 1727204006.44702: dumping result to json 7491 1727204006.44710: done dumping result, returning 7491 1727204006.44717: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [0affcd87-79f5-0a4a-ad01-000000001d53] 7491 1727204006.44726: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d53 7491 1727204006.44826: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d53 7491 1727204006.44829: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003102", "end": "2024-09-24 14:53:26.408680", "rc": 0, "start": "2024-09-24 14:53:26.405578" } STDOUT: eth0 lo peerveth0 veth0 7491 1727204006.44901: no more pending results, returning what we have 7491 1727204006.44905: results queue empty 7491 1727204006.44906: checking for any_errors_fatal 7491 1727204006.44907: done checking for any_errors_fatal 7491 1727204006.44908: checking for max_fail_percentage 7491 1727204006.44909: done checking for max_fail_percentage 7491 1727204006.44910: checking to see if all hosts have failed and the running result is not ok 7491 1727204006.44911: done checking to see if all hosts have failed 7491 1727204006.44912: getting the remaining hosts for this loop 7491 1727204006.44914: done getting the remaining hosts for this loop 7491 1727204006.44918: getting the next task for host managed-node3 7491 1727204006.44925: done getting next task for host managed-node3 7491 1727204006.44927: ^ task is: TASK: Set current_interfaces 7491 1727204006.44933: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727204006.44939: getting variables 7491 1727204006.44941: in VariableManager get_vars() 7491 1727204006.44992: Calling all_inventory to load vars for managed-node3 7491 1727204006.44995: Calling groups_inventory to load vars for managed-node3 7491 1727204006.44997: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204006.45007: Calling all_plugins_play to load vars for managed-node3 7491 1727204006.45010: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204006.45012: Calling groups_plugins_play to load vars for managed-node3 7491 1727204006.46306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204006.48003: done with get_vars() 7491 1727204006.48035: done getting variables 7491 1727204006.48098: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:53:26 -0400 (0:00:00.339) 0:00:48.405 ***** 7491 1727204006.48136: entering _queue_task() for managed-node3/set_fact 7491 1727204006.48468: worker is 1 (out of 1 available) 7491 1727204006.48478: exiting _queue_task() for managed-node3/set_fact 7491 1727204006.48493: done queuing things up, now waiting for results queue to drain 7491 1727204006.48495: waiting for pending results... 7491 1727204006.48803: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 7491 1727204006.48942: in run() - task 0affcd87-79f5-0a4a-ad01-000000001d54 7491 1727204006.48969: variable 'ansible_search_path' from source: unknown 7491 1727204006.48978: variable 'ansible_search_path' from source: unknown 7491 1727204006.49022: calling self._execute() 7491 1727204006.49138: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.49151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.49169: variable 'omit' from source: magic vars 7491 1727204006.49649: variable 'ansible_distribution_major_version' from source: facts 7491 1727204006.49669: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204006.49680: variable 'omit' from source: magic vars 7491 1727204006.49751: variable 'omit' from source: magic vars 7491 1727204006.49877: variable '_current_interfaces' from source: set_fact 7491 1727204006.49955: variable 'omit' from source: magic vars 7491 1727204006.50003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204006.50047: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204006.50076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204006.50097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204006.50113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204006.50152: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204006.50160: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.50170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.50282: Set connection var ansible_timeout to 10 7491 1727204006.50294: Set connection var ansible_pipelining to False 7491 1727204006.50302: Set connection var ansible_shell_type to sh 7491 1727204006.50312: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204006.50325: Set connection var ansible_shell_executable to /bin/sh 7491 1727204006.50333: Set connection var ansible_connection to ssh 7491 1727204006.50363: variable 'ansible_shell_executable' from source: unknown 7491 1727204006.50373: variable 'ansible_connection' from source: unknown 7491 1727204006.50379: variable 'ansible_module_compression' from source: unknown 7491 1727204006.50385: variable 'ansible_shell_type' from source: unknown 7491 1727204006.50391: variable 'ansible_shell_executable' from source: unknown 7491 1727204006.50397: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.50405: variable 'ansible_pipelining' from source: unknown 7491 1727204006.50411: variable 'ansible_timeout' from source: unknown 7491 1727204006.50421: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 7491 1727204006.50565: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204006.50585: variable 'omit' from source: magic vars 7491 1727204006.50594: starting attempt loop 7491 1727204006.50600: running the handler 7491 1727204006.50614: handler run complete 7491 1727204006.50632: attempt loop complete, returning result 7491 1727204006.50639: _execute() done 7491 1727204006.50644: dumping result to json 7491 1727204006.50651: done dumping result, returning 7491 1727204006.50661: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [0affcd87-79f5-0a4a-ad01-000000001d54] 7491 1727204006.50673: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d54 7491 1727204006.50779: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d54 7491 1727204006.50787: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo", "peerveth0", "veth0" ] }, "changed": false } 7491 1727204006.50855: no more pending results, returning what we have 7491 1727204006.50859: results queue empty 7491 1727204006.50860: checking for any_errors_fatal 7491 1727204006.50872: done checking for any_errors_fatal 7491 1727204006.50873: checking for max_fail_percentage 7491 1727204006.50875: done checking for max_fail_percentage 7491 1727204006.50876: checking to see if all hosts have failed and the running result is not ok 7491 1727204006.50877: done checking to see if all hosts have failed 7491 1727204006.50877: getting the remaining hosts for this loop 7491 1727204006.50880: done getting the remaining hosts for this loop 7491 1727204006.50884: getting the next task for host managed-node3 7491 1727204006.50893: done getting next task for 
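The `set_fact` result above turns the earlier command output into the list fact `current_interfaces: ["eth0", "lo", "peerveth0", "veth0"]`. A sketch of the conversion, assuming the task consumes something like the registered result's `stdout_lines` (the exact Jinja2 expression in `get_current_interfaces.yml` is not shown in the log):

```python
def split_stdout_lines(stdout):
    """Mimic Ansible's stdout_lines view of a command result: split the
    captured stdout into a list of lines, dropping the trailing newline."""
    return stdout.splitlines()

# The stdout captured by the "Gather current interface info" task:
current_interfaces = split_stdout_lines("eth0\nlo\npeerveth0\nveth0\n")
```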
host managed-node3 7491 1727204006.50896: ^ task is: TASK: Show current_interfaces 7491 1727204006.50900: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727204006.50905: getting variables 7491 1727204006.50907: in VariableManager get_vars() 7491 1727204006.50967: Calling all_inventory to load vars for managed-node3 7491 1727204006.50970: Calling groups_inventory to load vars for managed-node3 7491 1727204006.50972: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204006.50986: Calling all_plugins_play to load vars for managed-node3 7491 1727204006.50988: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204006.50991: Calling groups_plugins_play to load vars for managed-node3 7491 1727204006.52779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204006.54873: done with get_vars() 7491 1727204006.54902: done getting variables 7491 1727204006.54967: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:53:26 -0400 (0:00:00.068) 0:00:48.473 ***** 7491 1727204006.55001: entering _queue_task() for managed-node3/debug 7491 1727204006.56121: worker is 1 (out of 1 available) 7491 1727204006.56136: exiting _queue_task() for managed-node3/debug 7491 1727204006.56149: done queuing things up, now waiting for results queue to drain 7491 1727204006.56151: waiting for pending results... 7491 1727204006.56876: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 7491 1727204006.57016: in run() - task 0affcd87-79f5-0a4a-ad01-000000001d1d 7491 1727204006.57040: variable 'ansible_search_path' from source: unknown 7491 1727204006.57049: variable 'ansible_search_path' from source: unknown 7491 1727204006.57094: calling self._execute() 7491 1727204006.57230: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.57242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.57257: variable 'omit' from source: magic vars 7491 1727204006.57697: variable 'ansible_distribution_major_version' from source: facts 7491 1727204006.57717: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204006.57730: variable 'omit' from source: magic vars 7491 1727204006.57796: variable 'omit' from source: magic vars 7491 1727204006.57909: variable 'current_interfaces' from source: set_fact 7491 1727204006.57947: variable 'omit' from source: magic vars 7491 1727204006.58004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204006.58047: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204006.58083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204006.58113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204006.58131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204006.58170: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204006.58179: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.58191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.58310: Set connection var ansible_timeout to 10 7491 1727204006.58324: Set connection var ansible_pipelining to False 7491 1727204006.58334: Set connection var ansible_shell_type to sh 7491 1727204006.58344: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204006.58356: Set connection var ansible_shell_executable to /bin/sh 7491 1727204006.58369: Set connection var ansible_connection to ssh 7491 1727204006.58396: variable 'ansible_shell_executable' from source: unknown 7491 1727204006.58413: variable 'ansible_connection' from source: unknown 7491 1727204006.58425: variable 'ansible_module_compression' from source: unknown 7491 1727204006.58432: variable 'ansible_shell_type' from source: unknown 7491 1727204006.58438: variable 'ansible_shell_executable' from source: unknown 7491 1727204006.58445: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.58452: variable 'ansible_pipelining' from source: unknown 7491 1727204006.58459: variable 'ansible_timeout' from source: unknown 7491 1727204006.58469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 
1727204006.58622: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204006.58645: variable 'omit' from source: magic vars 7491 1727204006.58654: starting attempt loop 7491 1727204006.58660: running the handler 7491 1727204006.58708: handler run complete 7491 1727204006.58725: attempt loop complete, returning result 7491 1727204006.58731: _execute() done 7491 1727204006.58741: dumping result to json 7491 1727204006.58753: done dumping result, returning 7491 1727204006.58767: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [0affcd87-79f5-0a4a-ad01-000000001d1d] 7491 1727204006.58778: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d1d ok: [managed-node3] => {} MSG: current_interfaces: ['eth0', 'lo', 'peerveth0', 'veth0'] 7491 1727204006.58933: no more pending results, returning what we have 7491 1727204006.58937: results queue empty 7491 1727204006.58938: checking for any_errors_fatal 7491 1727204006.58945: done checking for any_errors_fatal 7491 1727204006.58946: checking for max_fail_percentage 7491 1727204006.58947: done checking for max_fail_percentage 7491 1727204006.58948: checking to see if all hosts have failed and the running result is not ok 7491 1727204006.58949: done checking to see if all hosts have failed 7491 1727204006.58950: getting the remaining hosts for this loop 7491 1727204006.58952: done getting the remaining hosts for this loop 7491 1727204006.58956: getting the next task for host managed-node3 7491 1727204006.58966: done getting next task for host managed-node3 7491 1727204006.58968: ^ task is: TASK: Install iproute 7491 1727204006.58971: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727204006.58975: getting variables 7491 1727204006.58977: in VariableManager get_vars() 7491 1727204006.59033: Calling all_inventory to load vars for managed-node3 7491 1727204006.59036: Calling groups_inventory to load vars for managed-node3 7491 1727204006.59038: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204006.59051: Calling all_plugins_play to load vars for managed-node3 7491 1727204006.59053: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204006.59057: Calling groups_plugins_play to load vars for managed-node3 7491 1727204006.59771: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d1d 7491 1727204006.59774: WORKER PROCESS EXITING 7491 1727204006.61411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204006.64587: done with get_vars() 7491 1727204006.64624: done getting variables 7491 1727204006.65332: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:53:26 
-0400 (0:00:00.103) 0:00:48.577 ***** 7491 1727204006.65369: entering _queue_task() for managed-node3/package 7491 1727204006.65709: worker is 1 (out of 1 available) 7491 1727204006.65723: exiting _queue_task() for managed-node3/package 7491 1727204006.65737: done queuing things up, now waiting for results queue to drain 7491 1727204006.65739: waiting for pending results... 7491 1727204006.66040: running TaskExecutor() for managed-node3/TASK: Install iproute 7491 1727204006.66155: in run() - task 0affcd87-79f5-0a4a-ad01-000000001a72 7491 1727204006.66181: variable 'ansible_search_path' from source: unknown 7491 1727204006.66189: variable 'ansible_search_path' from source: unknown 7491 1727204006.66232: calling self._execute() 7491 1727204006.66344: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.66356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.66371: variable 'omit' from source: magic vars 7491 1727204006.66768: variable 'ansible_distribution_major_version' from source: facts 7491 1727204006.66786: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204006.66798: variable 'omit' from source: magic vars 7491 1727204006.66851: variable 'omit' from source: magic vars 7491 1727204006.67067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7491 1727204006.69541: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7491 1727204006.69617: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7491 1727204006.69668: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7491 1727204006.69708: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7491 1727204006.69742: Loading FilterModule 
'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7491 1727204006.69849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7491 1727204006.70251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7491 1727204006.70283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7491 1727204006.70335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7491 1727204006.70353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7491 1727204006.70469: variable '__network_is_ostree' from source: set_fact 7491 1727204006.70480: variable 'omit' from source: magic vars 7491 1727204006.70513: variable 'omit' from source: magic vars 7491 1727204006.70553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204006.70587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204006.70611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204006.70639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204006.70654: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204006.70691: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204006.70700: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.70707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.70823: Set connection var ansible_timeout to 10 7491 1727204006.70837: Set connection var ansible_pipelining to False 7491 1727204006.70850: Set connection var ansible_shell_type to sh 7491 1727204006.70862: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204006.70876: Set connection var ansible_shell_executable to /bin/sh 7491 1727204006.70885: Set connection var ansible_connection to ssh 7491 1727204006.70911: variable 'ansible_shell_executable' from source: unknown 7491 1727204006.70917: variable 'ansible_connection' from source: unknown 7491 1727204006.70928: variable 'ansible_module_compression' from source: unknown 7491 1727204006.70934: variable 'ansible_shell_type' from source: unknown 7491 1727204006.70941: variable 'ansible_shell_executable' from source: unknown 7491 1727204006.70947: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204006.70955: variable 'ansible_pipelining' from source: unknown 7491 1727204006.70966: variable 'ansible_timeout' from source: unknown 7491 1727204006.70973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204006.71077: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204006.71093: variable 'omit' from source: magic vars 7491 1727204006.71102: starting attempt loop 7491 
1727204006.71108: running the handler 7491 1727204006.71118: variable 'ansible_facts' from source: unknown 7491 1727204006.71127: variable 'ansible_facts' from source: unknown 7491 1727204006.71163: _low_level_execute_command(): starting 7491 1727204006.71178: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727204006.71914: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204006.71935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.71957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.71981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.72027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204006.72038: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204006.72053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.72071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204006.72081: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204006.72089: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204006.72099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.72111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.72126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.72137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204006.72145: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204006.72163: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.72244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204006.72281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204006.72300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204006.72390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204006.73978: stdout chunk (state=3): >>>/root <<< 7491 1727204006.74182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204006.74186: stdout chunk (state=3): >>><<< 7491 1727204006.74188: stderr chunk (state=3): >>><<< 7491 1727204006.74310: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 
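The `_low_level_execute_command()` call above is Ansible probing the remote user's home directory with `/bin/sh -c 'echo ~ && sleep 0'` before creating its temp directory (rc=0, stdout=`/root`). A sketch of the same probe run locally with `subprocess` rather than over the SSH connection the log shows:

```python
import subprocess

def probe_home_dir():
    """Run the home-directory probe from the log locally: the shell
    expands `~` to the user's home; `sleep 0` just pads the one-liner."""
    result = subprocess.run(
        ["/bin/sh", "-c", "echo ~ && sleep 0"],
        capture_output=True, text=True,
    )
    return result.returncode, result.stdout.strip()
```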
1727204006.74313: _low_level_execute_command(): starting 7491 1727204006.74317: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204006.7421012-9607-14456096321932 `" && echo ansible-tmp-1727204006.7421012-9607-14456096321932="` echo /root/.ansible/tmp/ansible-tmp-1727204006.7421012-9607-14456096321932 `" ) && sleep 0' 7491 1727204006.75873: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204006.75890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.75907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.75930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.75979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204006.75992: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204006.76007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.76029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204006.76044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204006.76059: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204006.76074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.76089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.76105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.76117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 
1727204006.76171: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204006.76187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.76264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204006.76401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204006.76421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204006.76502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204006.78331: stdout chunk (state=3): >>>ansible-tmp-1727204006.7421012-9607-14456096321932=/root/.ansible/tmp/ansible-tmp-1727204006.7421012-9607-14456096321932 <<< 7491 1727204006.78544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204006.78548: stdout chunk (state=3): >>><<< 7491 1727204006.78550: stderr chunk (state=3): >>><<< 7491 1727204006.78872: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204006.7421012-9607-14456096321932=/root/.ansible/tmp/ansible-tmp-1727204006.7421012-9607-14456096321932 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204006.78876: variable 'ansible_module_compression' from source: unknown 7491 1727204006.78878: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7491 1727204006.78880: variable 'ansible_facts' from source: unknown 7491 1727204006.78882: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204006.7421012-9607-14456096321932/AnsiballZ_dnf.py 7491 1727204006.79553: Sending initial data 7491 1727204006.79556: Sent initial data (149 bytes) 7491 1727204006.82489: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204006.83287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.83298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.83312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.83353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204006.83360: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204006.83372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.83386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204006.83392: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204006.83399: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204006.83406: stderr chunk (state=3): 
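The remote tmpdir command above (`umask 77 && mkdir -p ... && mkdir ansible-tmp-...`) creates a private, uniquely named working directory. A sketch of the same pattern in Python; the `timestamp-pid-random` naming is inferred from the path in the log, not taken from Ansible's source:

```python
import os
import random
import time

def make_ansible_tmpdir(base):
    """Recreate the tmpdir pattern from the log: set umask 77 so new
    directories come out mode 0700, `mkdir -p` the base, then create a
    uniquely named ansible-tmp-<timestamp>-<pid>-<random> directory."""
    old_umask = os.umask(0o077)           # equivalent of `umask 77`
    try:
        os.makedirs(base, exist_ok=True)  # equivalent of `mkdir -p`
        name = "ansible-tmp-%s-%s-%s" % (
            time.time(), os.getpid(), random.randint(0, 2**48))
        path = os.path.join(base, name)
        os.mkdir(path)                    # plain mkdir: fails if it exists
        return path
    finally:
        os.umask(old_umask)
```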
>>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.83414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.83425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.83432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204006.83438: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204006.83447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.83523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204006.83542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204006.83557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204006.83631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204006.85323: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727204006.85359: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727204006.85407: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmp1v_8sg2m 
/root/.ansible/tmp/ansible-tmp-1727204006.7421012-9607-14456096321932/AnsiballZ_dnf.py <<< 7491 1727204006.85446: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727204006.87196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204006.87381: stderr chunk (state=3): >>><<< 7491 1727204006.87385: stdout chunk (state=3): >>><<< 7491 1727204006.87387: done transferring module to remote 7491 1727204006.87390: _low_level_execute_command(): starting 7491 1727204006.87393: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204006.7421012-9607-14456096321932/ /root/.ansible/tmp/ansible-tmp-1727204006.7421012-9607-14456096321932/AnsiballZ_dnf.py && sleep 0' 7491 1727204006.88839: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.88843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.88869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204006.88874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.88885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.89058: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204006.89129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204006.89132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204006.89192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204006.90894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204006.90972: stderr chunk (state=3): >>><<< 7491 1727204006.90976: stdout chunk (state=3): >>><<< 7491 1727204006.91077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204006.91080: _low_level_execute_command(): starting 7491 1727204006.91083: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204006.7421012-9607-14456096321932/AnsiballZ_dnf.py && sleep 0' 7491 1727204006.92532: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204006.92597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.92618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.92666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.92711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204006.92754: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204006.92771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.92809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204006.92823: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204006.92841: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204006.92853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204006.92909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204006.92927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204006.92947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204006.92959: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204006.92977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204006.93059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204006.93183: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204006.93199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204006.93312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204007.84435: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7491 1727204007.88390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 7491 1727204007.88440: stderr chunk (state=3): >>><<< 7491 1727204007.88444: stdout chunk (state=3): >>><<< 7491 1727204007.88463: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727204007.88501: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204006.7421012-9607-14456096321932/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727204007.88506: _low_level_execute_command(): starting 7491 1727204007.88511: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204006.7421012-9607-14456096321932/ > /dev/null 2>&1 && sleep 0' 7491 1727204007.88958: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204007.88961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204007.88995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204007.88998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204007.89000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204007.89077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204007.89082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204007.89084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204007.89149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204007.90980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204007.90985: stdout chunk (state=3): >>><<< 7491 1727204007.90990: stderr chunk (state=3): >>><<< 7491 1727204007.91009: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204007.91016: handler run complete 7491 1727204007.91186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7491 1727204007.91368: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7491 1727204007.91412: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7491 1727204007.91442: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7491 1727204007.91472: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7491 1727204007.91549: variable '__install_status' from source: set_fact 7491 1727204007.91571: Evaluated conditional (__install_status is success): True 7491 1727204007.91589: attempt loop complete, returning result 7491 1727204007.91592: _execute() done 7491 1727204007.91594: dumping result to json 7491 1727204007.91601: done dumping result, returning 7491 1727204007.91727: done running TaskExecutor() for managed-node3/TASK: Install iproute [0affcd87-79f5-0a4a-ad01-000000001a72] 7491 1727204007.91732: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a72 ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7491 1727204007.91934: no more pending results, returning what we have 7491 1727204007.91938: results queue empty 7491 1727204007.91939: checking for any_errors_fatal 7491 1727204007.91946: done checking for any_errors_fatal 7491 1727204007.91947: checking for max_fail_percentage 7491 1727204007.91948: done checking for max_fail_percentage 7491 1727204007.91949: checking to see if all hosts have failed and the running result is not ok 7491 1727204007.91956: done sending task 
result for task 0affcd87-79f5-0a4a-ad01-000000001a72 7491 1727204007.91959: done checking to see if all hosts have failed 7491 1727204007.91967: WORKER PROCESS EXITING 7491 1727204007.91969: getting the remaining hosts for this loop 7491 1727204007.91975: done getting the remaining hosts for this loop 7491 1727204007.91980: getting the next task for host managed-node3 7491 1727204007.91986: done getting next task for host managed-node3 7491 1727204007.91989: ^ task is: TASK: Create veth interface {{ interface }} 7491 1727204007.91992: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204007.91996: getting variables 7491 1727204007.91999: in VariableManager get_vars() 7491 1727204007.92051: Calling all_inventory to load vars for managed-node3 7491 1727204007.92054: Calling groups_inventory to load vars for managed-node3 7491 1727204007.92057: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204007.92071: Calling all_plugins_play to load vars for managed-node3 7491 1727204007.92074: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204007.92077: Calling groups_plugins_play to load vars for managed-node3 7491 1727204007.95547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204007.97950: done with get_vars() 7491 1727204007.97991: done getting variables 7491 1727204007.98057: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727204007.98193: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:53:27 -0400 (0:00:01.328) 0:00:49.906 ***** 7491 1727204007.98232: entering _queue_task() for managed-node3/command 7491 1727204007.98628: worker is 1 (out of 1 available) 7491 1727204007.98646: exiting _queue_task() for managed-node3/command 7491 1727204007.98662: done queuing things up, now waiting for results queue to drain 7491 1727204007.98665: waiting for pending results... 
7491 1727204007.98988: running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 7491 1727204007.99110: in run() - task 0affcd87-79f5-0a4a-ad01-000000001a73 7491 1727204007.99134: variable 'ansible_search_path' from source: unknown 7491 1727204007.99141: variable 'ansible_search_path' from source: unknown 7491 1727204007.99444: variable 'interface' from source: play vars 7491 1727204007.99540: variable 'interface' from source: play vars 7491 1727204007.99628: variable 'interface' from source: play vars 7491 1727204007.99800: Loaded config def from plugin (lookup/items) 7491 1727204007.99814: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7491 1727204007.99846: variable 'omit' from source: magic vars 7491 1727204008.00008: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.00021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.00036: variable 'omit' from source: magic vars 7491 1727204008.00296: variable 'ansible_distribution_major_version' from source: facts 7491 1727204008.00314: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204008.00540: variable 'type' from source: play vars 7491 1727204008.00550: variable 'state' from source: include params 7491 1727204008.00559: variable 'interface' from source: play vars 7491 1727204008.00570: variable 'current_interfaces' from source: set_fact 7491 1727204008.00581: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7491 1727204008.00588: when evaluation is False, skipping this task 7491 1727204008.00630: variable 'item' from source: unknown 7491 1727204008.00698: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 
'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 7491 1727204008.00933: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.00950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.00966: variable 'omit' from source: magic vars 7491 1727204008.01128: variable 'ansible_distribution_major_version' from source: facts 7491 1727204008.01139: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204008.01334: variable 'type' from source: play vars 7491 1727204008.01344: variable 'state' from source: include params 7491 1727204008.01352: variable 'interface' from source: play vars 7491 1727204008.01359: variable 'current_interfaces' from source: set_fact 7491 1727204008.01372: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7491 1727204008.01379: when evaluation is False, skipping this task 7491 1727204008.01410: variable 'item' from source: unknown 7491 1727204008.01480: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 7491 1727204008.01626: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.01642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.01655: variable 'omit' from source: magic vars 7491 1727204008.01819: variable 'ansible_distribution_major_version' from source: facts 7491 1727204008.01829: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204008.02025: variable 'type' from source: play vars 7491 
1727204008.02035: variable 'state' from source: include params 7491 1727204008.02042: variable 'interface' from source: play vars 7491 1727204008.02049: variable 'current_interfaces' from source: set_fact 7491 1727204008.02058: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7491 1727204008.02067: when evaluation is False, skipping this task 7491 1727204008.02098: variable 'item' from source: unknown 7491 1727204008.02169: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 7491 1727204008.02266: dumping result to json 7491 1727204008.02277: done dumping result, returning 7491 1727204008.02286: done running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 [0affcd87-79f5-0a4a-ad01-000000001a73] 7491 1727204008.02297: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a73 skipping: [managed-node3] => { "changed": false } MSG: All items skipped 7491 1727204008.02399: no more pending results, returning what we have 7491 1727204008.02403: results queue empty 7491 1727204008.02404: checking for any_errors_fatal 7491 1727204008.02415: done checking for any_errors_fatal 7491 1727204008.02416: checking for max_fail_percentage 7491 1727204008.02418: done checking for max_fail_percentage 7491 1727204008.02419: checking to see if all hosts have failed and the running result is not ok 7491 1727204008.02420: done checking to see if all hosts have failed 7491 1727204008.02421: getting the remaining hosts for this loop 7491 1727204008.02423: done getting the remaining hosts for this loop 7491 1727204008.02427: getting the next task for host managed-node3 7491 1727204008.02434: done getting next task for host managed-node3 
7491 1727204008.02436: ^ task is: TASK: Set up veth as managed by NetworkManager 7491 1727204008.02439: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727204008.02444: getting variables 7491 1727204008.02446: in VariableManager get_vars() 7491 1727204008.02504: Calling all_inventory to load vars for managed-node3 7491 1727204008.02507: Calling groups_inventory to load vars for managed-node3 7491 1727204008.02510: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204008.02523: Calling all_plugins_play to load vars for managed-node3 7491 1727204008.02526: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204008.02529: Calling groups_plugins_play to load vars for managed-node3 7491 1727204008.03553: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a73 7491 1727204008.03557: WORKER PROCESS EXITING 7491 1727204008.04343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204008.06085: done with get_vars() 7491 1727204008.06115: done getting variables 7491 1727204008.06184: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] 
******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.079) 0:00:49.986 ***** 7491 1727204008.06221: entering _queue_task() for managed-node3/command 7491 1727204008.06541: worker is 1 (out of 1 available) 7491 1727204008.06554: exiting _queue_task() for managed-node3/command 7491 1727204008.06569: done queuing things up, now waiting for results queue to drain 7491 1727204008.06570: waiting for pending results... 7491 1727204008.06866: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 7491 1727204008.06983: in run() - task 0affcd87-79f5-0a4a-ad01-000000001a74 7491 1727204008.07008: variable 'ansible_search_path' from source: unknown 7491 1727204008.07024: variable 'ansible_search_path' from source: unknown 7491 1727204008.07067: calling self._execute() 7491 1727204008.07186: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.07200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.07214: variable 'omit' from source: magic vars 7491 1727204008.07610: variable 'ansible_distribution_major_version' from source: facts 7491 1727204008.07628: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204008.07806: variable 'type' from source: play vars 7491 1727204008.07816: variable 'state' from source: include params 7491 1727204008.07825: Evaluated conditional (type == 'veth' and state == 'present'): False 7491 1727204008.07832: when evaluation is False, skipping this task 7491 1727204008.07838: _execute() done 7491 1727204008.07844: dumping result to json 7491 1727204008.07851: done dumping result, returning 7491 1727204008.07859: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-0a4a-ad01-000000001a74] 7491 
1727204008.07872: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a74 7491 1727204008.07977: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a74 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 7491 1727204008.08034: no more pending results, returning what we have 7491 1727204008.08039: results queue empty 7491 1727204008.08040: checking for any_errors_fatal 7491 1727204008.08052: done checking for any_errors_fatal 7491 1727204008.08053: checking for max_fail_percentage 7491 1727204008.08055: done checking for max_fail_percentage 7491 1727204008.08056: checking to see if all hosts have failed and the running result is not ok 7491 1727204008.08057: done checking to see if all hosts have failed 7491 1727204008.08058: getting the remaining hosts for this loop 7491 1727204008.08060: done getting the remaining hosts for this loop 7491 1727204008.08066: getting the next task for host managed-node3 7491 1727204008.08072: done getting next task for host managed-node3 7491 1727204008.08075: ^ task is: TASK: Delete veth interface {{ interface }} 7491 1727204008.08078: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204008.08083: getting variables 7491 1727204008.08085: in VariableManager get_vars() 7491 1727204008.08142: Calling all_inventory to load vars for managed-node3 7491 1727204008.08145: Calling groups_inventory to load vars for managed-node3 7491 1727204008.08147: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204008.08162: Calling all_plugins_play to load vars for managed-node3 7491 1727204008.08167: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204008.08170: Calling groups_plugins_play to load vars for managed-node3 7491 1727204008.09216: WORKER PROCESS EXITING 7491 1727204008.11399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204008.15228: done with get_vars() 7491 1727204008.15270: done getting variables 7491 1727204008.15334: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727204008.15459: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.092) 0:00:50.078 ***** 7491 1727204008.15498: entering _queue_task() for managed-node3/command 7491 1727204008.16747: worker is 1 (out of 1 available) 7491 1727204008.16761: exiting _queue_task() for managed-node3/command 7491 1727204008.16778: done queuing things up, now waiting for results queue to drain 7491 1727204008.16780: waiting for pending results... 
7491 1727204008.17271: running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 7491 1727204008.17503: in run() - task 0affcd87-79f5-0a4a-ad01-000000001a75 7491 1727204008.17528: variable 'ansible_search_path' from source: unknown 7491 1727204008.17537: variable 'ansible_search_path' from source: unknown 7491 1727204008.17584: calling self._execute() 7491 1727204008.17876: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.17889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.17905: variable 'omit' from source: magic vars 7491 1727204008.18669: variable 'ansible_distribution_major_version' from source: facts 7491 1727204008.18753: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204008.19186: variable 'type' from source: play vars 7491 1727204008.19196: variable 'state' from source: include params 7491 1727204008.19205: variable 'interface' from source: play vars 7491 1727204008.19212: variable 'current_interfaces' from source: set_fact 7491 1727204008.19223: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 7491 1727204008.19235: variable 'omit' from source: magic vars 7491 1727204008.19396: variable 'omit' from source: magic vars 7491 1727204008.19608: variable 'interface' from source: play vars 7491 1727204008.19632: variable 'omit' from source: magic vars 7491 1727204008.19683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204008.19835: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204008.19862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204008.19886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 
1727204008.19903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204008.20018: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204008.20045: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.20075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.20301: Set connection var ansible_timeout to 10 7491 1727204008.20314: Set connection var ansible_pipelining to False 7491 1727204008.20376: Set connection var ansible_shell_type to sh 7491 1727204008.20389: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204008.20402: Set connection var ansible_shell_executable to /bin/sh 7491 1727204008.20412: Set connection var ansible_connection to ssh 7491 1727204008.20442: variable 'ansible_shell_executable' from source: unknown 7491 1727204008.20477: variable 'ansible_connection' from source: unknown 7491 1727204008.20485: variable 'ansible_module_compression' from source: unknown 7491 1727204008.20581: variable 'ansible_shell_type' from source: unknown 7491 1727204008.20591: variable 'ansible_shell_executable' from source: unknown 7491 1727204008.20598: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.20606: variable 'ansible_pipelining' from source: unknown 7491 1727204008.20614: variable 'ansible_timeout' from source: unknown 7491 1727204008.20622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.20884: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204008.20903: variable 'omit' from source: magic vars 7491 
1727204008.21028: starting attempt loop 7491 1727204008.21035: running the handler 7491 1727204008.21057: _low_level_execute_command(): starting 7491 1727204008.21073: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727204008.23013: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204008.23033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204008.23049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204008.23073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204008.23121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204008.23134: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204008.23149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204008.23170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204008.23184: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204008.23200: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204008.23213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204008.23228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204008.23244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204008.23258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204008.23272: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204008.23287: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204008.23369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204008.23539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204008.23554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204008.23645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204008.25257: stdout chunk (state=3): >>>/root <<< 7491 1727204008.25454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204008.25457: stdout chunk (state=3): >>><<< 7491 1727204008.25460: stderr chunk (state=3): >>><<< 7491 1727204008.25585: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204008.25595: _low_level_execute_command(): starting 7491 
1727204008.25599: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204008.2548523-9665-22087334915954 `" && echo ansible-tmp-1727204008.2548523-9665-22087334915954="` echo /root/.ansible/tmp/ansible-tmp-1727204008.2548523-9665-22087334915954 `" ) && sleep 0' 7491 1727204008.27814: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204008.27825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204008.27834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204008.27848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204008.27889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204008.28080: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204008.28089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204008.28102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204008.28109: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204008.28117: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204008.28125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204008.28132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204008.28143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204008.28150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204008.28156: stderr chunk (state=3): >>>debug2: match found 
<<< 7491 1727204008.28168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204008.28237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204008.28254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204008.28267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204008.28337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204008.30163: stdout chunk (state=3): >>>ansible-tmp-1727204008.2548523-9665-22087334915954=/root/.ansible/tmp/ansible-tmp-1727204008.2548523-9665-22087334915954 <<< 7491 1727204008.30279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204008.30359: stderr chunk (state=3): >>><<< 7491 1727204008.30362: stdout chunk (state=3): >>><<< 7491 1727204008.30383: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204008.2548523-9665-22087334915954=/root/.ansible/tmp/ansible-tmp-1727204008.2548523-9665-22087334915954 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204008.30416: variable 'ansible_module_compression' from source: unknown 7491 1727204008.30471: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727204008.30505: variable 'ansible_facts' from source: unknown 7491 1727204008.30590: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204008.2548523-9665-22087334915954/AnsiballZ_command.py 7491 1727204008.31205: Sending initial data 7491 1727204008.31209: Sent initial data (153 bytes) 7491 1727204008.33871: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204008.33875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204008.33878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204008.33880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204008.33972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204008.34081: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204008.34084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204008.34087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204008.34089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204008.34091: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204008.34092: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 7491 1727204008.34094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204008.34096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204008.34098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204008.34100: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204008.34102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204008.34140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204008.34317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204008.34330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204008.34397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204008.36074: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727204008.36117: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727204008.36160: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmphp7q1leh 
/root/.ansible/tmp/ansible-tmp-1727204008.2548523-9665-22087334915954/AnsiballZ_command.py <<< 7491 1727204008.36201: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727204008.37474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204008.37750: stderr chunk (state=3): >>><<< 7491 1727204008.37753: stdout chunk (state=3): >>><<< 7491 1727204008.37756: done transferring module to remote 7491 1727204008.37758: _low_level_execute_command(): starting 7491 1727204008.37761: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204008.2548523-9665-22087334915954/ /root/.ansible/tmp/ansible-tmp-1727204008.2548523-9665-22087334915954/AnsiballZ_command.py && sleep 0' 7491 1727204008.39335: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7491 1727204008.39356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204008.39377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204008.39402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204008.39453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204008.39494: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204008.39512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204008.39536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7491 1727204008.39549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 7491 1727204008.39560: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7491 1727204008.39576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7491 
1727204008.39592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204008.39613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204008.39631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204008.39644: stderr chunk (state=3): >>>debug2: match found <<< 7491 1727204008.39658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204008.39739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204008.39760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204008.39778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204008.39951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204008.41672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204008.41676: stdout chunk (state=3): >>><<< 7491 1727204008.41678: stderr chunk (state=3): >>><<< 7491 1727204008.41780: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204008.41783: _low_level_execute_command(): starting 7491 1727204008.41786: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204008.2548523-9665-22087334915954/AnsiballZ_command.py && sleep 0' 7491 1727204008.43243: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204008.43247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204008.43281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204008.43286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204008.43288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204008.43348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204008.43488: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 7491 1727204008.43491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204008.43552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204008.57849: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-24 14:53:28.565482", "end": "2024-09-24 14:53:28.577318", "delta": "0:00:00.011836", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 7491 1727204008.57854: stdout chunk (state=3): >>> <<< 7491 1727204008.59451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727204008.59455: stdout chunk (state=3): >>><<< 7491 1727204008.59460: stderr chunk (state=3): >>><<< 7491 1727204008.59486: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-24 14:53:28.565482", "end": "2024-09-24 14:53:28.577318", "delta": "0:00:00.011836", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727204008.59529: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204008.2548523-9665-22087334915954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727204008.59537: _low_level_execute_command(): starting 7491 1727204008.59542: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204008.2548523-9665-22087334915954/ > /dev/null 2>&1 && sleep 0' 7491 1727204008.61100: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204008.61104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204008.61208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204008.61214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727204008.61231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204008.61237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204008.61436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204008.61455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204008.61608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204008.63343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204008.63346: stderr chunk (state=3): >>><<< 7491 1727204008.63349: stdout chunk (state=3): >>><<< 7491 1727204008.63370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204008.63377: handler run complete 7491 1727204008.63399: Evaluated conditional (False): False 7491 1727204008.63408: attempt loop complete, returning result 7491 1727204008.63410: _execute() done 7491 1727204008.63413: dumping result to json 7491 1727204008.63418: done dumping result, returning 7491 1727204008.63430: done running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 [0affcd87-79f5-0a4a-ad01-000000001a75] 7491 1727204008.63436: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a75 7491 1727204008.63541: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a75 7491 1727204008.63543: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.011836", "end": "2024-09-24 14:53:28.577318", "rc": 0, "start": "2024-09-24 14:53:28.565482" } 7491 1727204008.63627: no more pending results, returning what we have 7491 1727204008.63631: results queue empty 7491 1727204008.63632: checking for any_errors_fatal 7491 1727204008.63639: done checking 
for any_errors_fatal 7491 1727204008.63640: checking for max_fail_percentage 7491 1727204008.63642: done checking for max_fail_percentage 7491 1727204008.63643: checking to see if all hosts have failed and the running result is not ok 7491 1727204008.63644: done checking to see if all hosts have failed 7491 1727204008.63645: getting the remaining hosts for this loop 7491 1727204008.63646: done getting the remaining hosts for this loop 7491 1727204008.63650: getting the next task for host managed-node3 7491 1727204008.63655: done getting next task for host managed-node3 7491 1727204008.63657: ^ task is: TASK: Create dummy interface {{ interface }} 7491 1727204008.63660: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204008.63666: getting variables 7491 1727204008.63668: in VariableManager get_vars() 7491 1727204008.63718: Calling all_inventory to load vars for managed-node3 7491 1727204008.63721: Calling groups_inventory to load vars for managed-node3 7491 1727204008.63723: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204008.63733: Calling all_plugins_play to load vars for managed-node3 7491 1727204008.63735: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204008.63738: Calling groups_plugins_play to load vars for managed-node3 7491 1727204008.65305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204008.67633: done with get_vars() 7491 1727204008.67663: done getting variables 7491 1727204008.67723: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727204008.67840: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.523) 0:00:50.602 ***** 7491 1727204008.67881: entering _queue_task() for managed-node3/command 7491 1727204008.68296: worker is 1 (out of 1 available) 7491 1727204008.68342: exiting _queue_task() for managed-node3/command 7491 1727204008.68355: done queuing things up, now waiting for results queue to drain 7491 1727204008.68357: waiting for pending results... 
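For readers tracing the "Delete veth interface veth0" task above: from the evaluated conditional (`type == 'veth' and state == 'absent' and interface in current_interfaces`), the executed argv (`ip link del veth0 type veth`), and the final `changed: false` result despite the module reporting `"changed": true`, the task at manage_test_interface.yml:43 is presumably shaped roughly like the following. This is a hypothetical reconstruction from the log, not the actual playbook source:

```yaml
# Hypothetical reconstruction of the task at
# tests/network/playbooks/tasks/manage_test_interface.yml:43.
# Names (type, state, interface, current_interfaces) come from the
# "variable ... from source" lines in the log above.
- name: "Delete veth interface {{ interface }}"
  command: ip link del {{ interface }} type veth
  when:
    - type == 'veth'
    - state == 'absent'
    - interface in current_interfaces
  # Inferred from "Evaluated conditional (False): False" after
  # "handler run complete" and the final "changed": false result:
  changed_when: false
```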
7491 1727204008.68659: running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 7491 1727204008.68776: in run() - task 0affcd87-79f5-0a4a-ad01-000000001a76 7491 1727204008.68820: variable 'ansible_search_path' from source: unknown 7491 1727204008.68846: variable 'ansible_search_path' from source: unknown 7491 1727204008.68895: calling self._execute() 7491 1727204008.69015: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.69030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.69040: variable 'omit' from source: magic vars 7491 1727204008.69452: variable 'ansible_distribution_major_version' from source: facts 7491 1727204008.69469: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204008.69705: variable 'type' from source: play vars 7491 1727204008.69708: variable 'state' from source: include params 7491 1727204008.69714: variable 'interface' from source: play vars 7491 1727204008.69717: variable 'current_interfaces' from source: set_fact 7491 1727204008.69730: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7491 1727204008.69733: when evaluation is False, skipping this task 7491 1727204008.69736: _execute() done 7491 1727204008.69738: dumping result to json 7491 1727204008.69749: done dumping result, returning 7491 1727204008.69754: done running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 [0affcd87-79f5-0a4a-ad01-000000001a76] 7491 1727204008.69769: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a76 7491 1727204008.69857: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a76 7491 1727204008.69860: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7491 
1727204008.69944: no more pending results, returning what we have 7491 1727204008.69948: results queue empty 7491 1727204008.69950: checking for any_errors_fatal 7491 1727204008.69968: done checking for any_errors_fatal 7491 1727204008.69970: checking for max_fail_percentage 7491 1727204008.69972: done checking for max_fail_percentage 7491 1727204008.69974: checking to see if all hosts have failed and the running result is not ok 7491 1727204008.69975: done checking to see if all hosts have failed 7491 1727204008.69976: getting the remaining hosts for this loop 7491 1727204008.69978: done getting the remaining hosts for this loop 7491 1727204008.69982: getting the next task for host managed-node3 7491 1727204008.69989: done getting next task for host managed-node3 7491 1727204008.69992: ^ task is: TASK: Delete dummy interface {{ interface }} 7491 1727204008.69995: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204008.70000: getting variables 7491 1727204008.70002: in VariableManager get_vars() 7491 1727204008.70060: Calling all_inventory to load vars for managed-node3 7491 1727204008.70071: Calling groups_inventory to load vars for managed-node3 7491 1727204008.70075: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204008.70090: Calling all_plugins_play to load vars for managed-node3 7491 1727204008.70094: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204008.70097: Calling groups_plugins_play to load vars for managed-node3 7491 1727204008.72378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204008.74474: done with get_vars() 7491 1727204008.74494: done getting variables 7491 1727204008.74538: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727204008.74622: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.067) 0:00:50.670 ***** 7491 1727204008.74646: entering _queue_task() for managed-node3/command 7491 1727204008.74880: worker is 1 (out of 1 available) 7491 1727204008.74891: exiting _queue_task() for managed-node3/command 7491 1727204008.74906: done queuing things up, now waiting for results queue to drain 7491 1727204008.74907: waiting for pending results... 
7491 1727204008.75094: running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 7491 1727204008.75175: in run() - task 0affcd87-79f5-0a4a-ad01-000000001a77 7491 1727204008.75187: variable 'ansible_search_path' from source: unknown 7491 1727204008.75191: variable 'ansible_search_path' from source: unknown 7491 1727204008.75227: calling self._execute() 7491 1727204008.75325: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.75329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.75338: variable 'omit' from source: magic vars 7491 1727204008.75742: variable 'ansible_distribution_major_version' from source: facts 7491 1727204008.75771: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204008.76743: variable 'type' from source: play vars 7491 1727204008.76746: variable 'state' from source: include params 7491 1727204008.76750: variable 'interface' from source: play vars 7491 1727204008.76752: variable 'current_interfaces' from source: set_fact 7491 1727204008.76754: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7491 1727204008.76756: when evaluation is False, skipping this task 7491 1727204008.76758: _execute() done 7491 1727204008.76760: dumping result to json 7491 1727204008.76762: done dumping result, returning 7491 1727204008.76765: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 [0affcd87-79f5-0a4a-ad01-000000001a77] 7491 1727204008.76768: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a77 7491 1727204008.76976: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a77 7491 1727204008.76979: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7491 
1727204008.77017: no more pending results, returning what we have 7491 1727204008.77020: results queue empty 7491 1727204008.77021: checking for any_errors_fatal 7491 1727204008.77027: done checking for any_errors_fatal 7491 1727204008.77028: checking for max_fail_percentage 7491 1727204008.77030: done checking for max_fail_percentage 7491 1727204008.77031: checking to see if all hosts have failed and the running result is not ok 7491 1727204008.77032: done checking to see if all hosts have failed 7491 1727204008.77034: getting the remaining hosts for this loop 7491 1727204008.77036: done getting the remaining hosts for this loop 7491 1727204008.77040: getting the next task for host managed-node3 7491 1727204008.77045: done getting next task for host managed-node3 7491 1727204008.77048: ^ task is: TASK: Create tap interface {{ interface }} 7491 1727204008.77051: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204008.77055: getting variables 7491 1727204008.77056: in VariableManager get_vars() 7491 1727204008.77102: Calling all_inventory to load vars for managed-node3 7491 1727204008.77105: Calling groups_inventory to load vars for managed-node3 7491 1727204008.77107: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204008.77117: Calling all_plugins_play to load vars for managed-node3 7491 1727204008.77119: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204008.77122: Calling groups_plugins_play to load vars for managed-node3 7491 1727204008.78615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204008.85813: done with get_vars() 7491 1727204008.85835: done getting variables 7491 1727204008.85875: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727204008.85944: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.113) 0:00:50.783 ***** 7491 1727204008.85971: entering _queue_task() for managed-node3/command 7491 1727204008.86204: worker is 1 (out of 1 available) 7491 1727204008.86217: exiting _queue_task() for managed-node3/command 7491 1727204008.86231: done queuing things up, now waiting for results queue to drain 7491 1727204008.86233: waiting for pending results... 
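The four create/delete tasks skipped here (dummy and tap, present and absent) all follow one shape in `manage_test_interface.yml`: a `command` task gated on `type`, `state`, and `current_interfaces`. A hedged sketch of that shape — the `ip link` arguments are assumptions for illustration, only the conditionals are read from the log:

```yaml
# Hypothetical sketch of the gated tasks in manage_test_interface.yml;
# command arguments are illustrative, the `when` expressions come from the log.
- name: Create dummy interface {{ interface }}
  command: ip link add {{ interface }} type dummy
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

- name: Delete dummy interface {{ interface }}
  command: ip link del {{ interface }}
  when: type == 'dummy' and state == 'absent' and interface in current_interfaces
```

Because `current_interfaces` is a `set_fact` result gathered earlier, each task is effectively idempotent: it only fires when the desired state differs from the observed one.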
7491 1727204008.86430: running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 7491 1727204008.86512: in run() - task 0affcd87-79f5-0a4a-ad01-000000001a78 7491 1727204008.86521: variable 'ansible_search_path' from source: unknown 7491 1727204008.86525: variable 'ansible_search_path' from source: unknown 7491 1727204008.86557: calling self._execute() 7491 1727204008.86643: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.86647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.86656: variable 'omit' from source: magic vars 7491 1727204008.87022: variable 'ansible_distribution_major_version' from source: facts 7491 1727204008.87041: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204008.87289: variable 'type' from source: play vars 7491 1727204008.87299: variable 'state' from source: include params 7491 1727204008.87307: variable 'interface' from source: play vars 7491 1727204008.87315: variable 'current_interfaces' from source: set_fact 7491 1727204008.87326: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7491 1727204008.87333: when evaluation is False, skipping this task 7491 1727204008.87338: _execute() done 7491 1727204008.87345: dumping result to json 7491 1727204008.87352: done dumping result, returning 7491 1727204008.87360: done running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 [0affcd87-79f5-0a4a-ad01-000000001a78] 7491 1727204008.87373: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a78 7491 1727204008.87482: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a78 7491 1727204008.87497: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7491 
1727204008.87547: no more pending results, returning what we have 7491 1727204008.87551: results queue empty 7491 1727204008.87552: checking for any_errors_fatal 7491 1727204008.87558: done checking for any_errors_fatal 7491 1727204008.87559: checking for max_fail_percentage 7491 1727204008.87561: done checking for max_fail_percentage 7491 1727204008.87562: checking to see if all hosts have failed and the running result is not ok 7491 1727204008.87563: done checking to see if all hosts have failed 7491 1727204008.87566: getting the remaining hosts for this loop 7491 1727204008.87568: done getting the remaining hosts for this loop 7491 1727204008.87572: getting the next task for host managed-node3 7491 1727204008.87579: done getting next task for host managed-node3 7491 1727204008.87582: ^ task is: TASK: Delete tap interface {{ interface }} 7491 1727204008.87585: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204008.87589: getting variables 7491 1727204008.87591: in VariableManager get_vars() 7491 1727204008.87649: Calling all_inventory to load vars for managed-node3 7491 1727204008.87653: Calling groups_inventory to load vars for managed-node3 7491 1727204008.87655: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204008.87672: Calling all_plugins_play to load vars for managed-node3 7491 1727204008.87675: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204008.87679: Calling groups_plugins_play to load vars for managed-node3 7491 1727204008.88802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204008.89732: done with get_vars() 7491 1727204008.89750: done getting variables 7491 1727204008.89796: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7491 1727204008.89885: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.039) 0:00:50.822 ***** 7491 1727204008.89909: entering _queue_task() for managed-node3/command 7491 1727204008.90137: worker is 1 (out of 1 available) 7491 1727204008.90151: exiting _queue_task() for managed-node3/command 7491 1727204008.90167: done queuing things up, now waiting for results queue to drain 7491 1727204008.90168: waiting for pending results... 
7491 1727204008.90358: running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 7491 1727204008.90445: in run() - task 0affcd87-79f5-0a4a-ad01-000000001a79 7491 1727204008.90455: variable 'ansible_search_path' from source: unknown 7491 1727204008.90458: variable 'ansible_search_path' from source: unknown 7491 1727204008.90492: calling self._execute() 7491 1727204008.90576: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.90580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.90588: variable 'omit' from source: magic vars 7491 1727204008.90875: variable 'ansible_distribution_major_version' from source: facts 7491 1727204008.90885: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204008.91031: variable 'type' from source: play vars 7491 1727204008.91035: variable 'state' from source: include params 7491 1727204008.91038: variable 'interface' from source: play vars 7491 1727204008.91042: variable 'current_interfaces' from source: set_fact 7491 1727204008.91051: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7491 1727204008.91055: when evaluation is False, skipping this task 7491 1727204008.91057: _execute() done 7491 1727204008.91060: dumping result to json 7491 1727204008.91063: done dumping result, returning 7491 1727204008.91070: done running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 [0affcd87-79f5-0a4a-ad01-000000001a79] 7491 1727204008.91076: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a79 7491 1727204008.91162: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001a79 7491 1727204008.91166: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7491 1727204008.91216: no 
more pending results, returning what we have 7491 1727204008.91222: results queue empty 7491 1727204008.91224: checking for any_errors_fatal 7491 1727204008.91229: done checking for any_errors_fatal 7491 1727204008.91230: checking for max_fail_percentage 7491 1727204008.91232: done checking for max_fail_percentage 7491 1727204008.91233: checking to see if all hosts have failed and the running result is not ok 7491 1727204008.91234: done checking to see if all hosts have failed 7491 1727204008.91234: getting the remaining hosts for this loop 7491 1727204008.91236: done getting the remaining hosts for this loop 7491 1727204008.91240: getting the next task for host managed-node3 7491 1727204008.91247: done getting next task for host managed-node3 7491 1727204008.91250: ^ task is: TASK: Verify network state restored to default 7491 1727204008.91251: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204008.91255: getting variables 7491 1727204008.91256: in VariableManager get_vars() 7491 1727204008.91309: Calling all_inventory to load vars for managed-node3 7491 1727204008.91312: Calling groups_inventory to load vars for managed-node3 7491 1727204008.91314: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204008.91327: Calling all_plugins_play to load vars for managed-node3 7491 1727204008.91329: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204008.91332: Calling groups_plugins_play to load vars for managed-node3 7491 1727204008.92291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204008.93223: done with get_vars() 7491 1727204008.93242: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:149 Tuesday 24 September 2024 14:53:28 -0400 (0:00:00.034) 0:00:50.857 ***** 7491 1727204008.93313: entering _queue_task() for managed-node3/include_tasks 7491 1727204008.93550: worker is 1 (out of 1 available) 7491 1727204008.93565: exiting _queue_task() for managed-node3/include_tasks 7491 1727204008.93579: done queuing things up, now waiting for results queue to drain 7491 1727204008.93581: waiting for pending results... 
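"Verify network state restored to default" is not a regular action: the executor routes it to `include_tasks`, which is why the next log events process `check_network_dns.yml` as an included file and then extend the task lists. A sketch of what such an include looks like — the distro-version guard appears in the log but may be applied at play level rather than on the task itself; it is shown on the task here only for illustration:

```yaml
# Hypothetical sketch of the include seen at tests_auto_gateway.yml:149;
# only the target file name is taken from the log.
- name: Verify network state restored to default
  include_tasks: tasks/check_network_dns.yml
  when: ansible_distribution_major_version != '6'
```

Since `include_tasks` is dynamic, the included blocks are filtered on tags and appended to every targeted host's task list at runtime, which matches the "extending task lists for all hosts with included blocks" event below.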
7491 1727204008.93765: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 7491 1727204008.93827: in run() - task 0affcd87-79f5-0a4a-ad01-000000000151 7491 1727204008.93838: variable 'ansible_search_path' from source: unknown 7491 1727204008.93868: calling self._execute() 7491 1727204008.93952: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204008.93958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204008.93967: variable 'omit' from source: magic vars 7491 1727204008.94251: variable 'ansible_distribution_major_version' from source: facts 7491 1727204008.94261: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204008.94268: _execute() done 7491 1727204008.94271: dumping result to json 7491 1727204008.94275: done dumping result, returning 7491 1727204008.94281: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [0affcd87-79f5-0a4a-ad01-000000000151] 7491 1727204008.94288: sending task result for task 0affcd87-79f5-0a4a-ad01-000000000151 7491 1727204008.94378: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000000151 7491 1727204008.94382: WORKER PROCESS EXITING 7491 1727204008.94413: no more pending results, returning what we have 7491 1727204008.94417: in VariableManager get_vars() 7491 1727204008.94480: Calling all_inventory to load vars for managed-node3 7491 1727204008.94483: Calling groups_inventory to load vars for managed-node3 7491 1727204008.94485: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204008.94503: Calling all_plugins_play to load vars for managed-node3 7491 1727204008.94506: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204008.94508: Calling groups_plugins_play to load vars for managed-node3 7491 1727204008.95331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 7491 1727204008.96259: done with get_vars() 7491 1727204008.96275: variable 'ansible_search_path' from source: unknown 7491 1727204008.96287: we have included files to process 7491 1727204008.96287: generating all_blocks data 7491 1727204008.96289: done generating all_blocks data 7491 1727204008.96294: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7491 1727204008.96295: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7491 1727204008.96296: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7491 1727204008.96578: done processing included file 7491 1727204008.96580: iterating over new_blocks loaded from include file 7491 1727204008.96581: in VariableManager get_vars() 7491 1727204008.96597: done with get_vars() 7491 1727204008.96598: filtering new block on tags 7491 1727204008.96610: done filtering new block on tags 7491 1727204008.96611: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 7491 1727204008.96615: extending task lists for all hosts with included blocks 7491 1727204009.00179: done extending task lists 7491 1727204009.00181: done processing included files 7491 1727204009.00181: results queue empty 7491 1727204009.00182: checking for any_errors_fatal 7491 1727204009.00184: done checking for any_errors_fatal 7491 1727204009.00185: checking for max_fail_percentage 7491 1727204009.00185: done checking for max_fail_percentage 7491 1727204009.00186: checking to see if all hosts have failed and the running result is not ok 7491 1727204009.00187: done checking to see if all hosts have failed 7491 1727204009.00187: getting the 
remaining hosts for this loop 7491 1727204009.00188: done getting the remaining hosts for this loop 7491 1727204009.00190: getting the next task for host managed-node3 7491 1727204009.00193: done getting next task for host managed-node3 7491 1727204009.00194: ^ task is: TASK: Check routes and DNS 7491 1727204009.00196: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7491 1727204009.00197: getting variables 7491 1727204009.00198: in VariableManager get_vars() 7491 1727204009.00213: Calling all_inventory to load vars for managed-node3 7491 1727204009.00214: Calling groups_inventory to load vars for managed-node3 7491 1727204009.00216: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204009.00223: Calling all_plugins_play to load vars for managed-node3 7491 1727204009.00224: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204009.00226: Calling groups_plugins_play to load vars for managed-node3 7491 1727204009.00935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204009.01851: done with get_vars() 7491 1727204009.01870: done getting variables 7491 1727204009.01903: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:53:29 -0400 (0:00:00.086) 0:00:50.943 ***** 7491 1727204009.01926: entering _queue_task() for managed-node3/shell 7491 1727204009.02175: worker is 1 (out of 1 available) 7491 1727204009.02189: exiting _queue_task() for managed-node3/shell 7491 1727204009.02201: done queuing things up, now waiting for results queue to drain 7491 1727204009.02203: waiting for pending results... 7491 1727204009.02388: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 7491 1727204009.02450: in run() - task 0affcd87-79f5-0a4a-ad01-000000001d93 7491 1727204009.02460: variable 'ansible_search_path' from source: unknown 7491 1727204009.02466: variable 'ansible_search_path' from source: unknown 7491 1727204009.02493: calling self._execute() 7491 1727204009.02577: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204009.02581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204009.02589: variable 'omit' from source: magic vars 7491 1727204009.02878: variable 'ansible_distribution_major_version' from source: facts 7491 1727204009.02888: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204009.02895: variable 'omit' from source: magic vars 7491 1727204009.02925: variable 'omit' from source: magic vars 7491 1727204009.02947: variable 'omit' from source: magic vars 7491 1727204009.02984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204009.03010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204009.03027: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204009.03041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204009.03051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204009.03079: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204009.03082: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204009.03086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204009.03152: Set connection var ansible_timeout to 10 7491 1727204009.03157: Set connection var ansible_pipelining to False 7491 1727204009.03162: Set connection var ansible_shell_type to sh 7491 1727204009.03170: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204009.03178: Set connection var ansible_shell_executable to /bin/sh 7491 1727204009.03180: Set connection var ansible_connection to ssh 7491 1727204009.03202: variable 'ansible_shell_executable' from source: unknown 7491 1727204009.03205: variable 'ansible_connection' from source: unknown 7491 1727204009.03208: variable 'ansible_module_compression' from source: unknown 7491 1727204009.03211: variable 'ansible_shell_type' from source: unknown 7491 1727204009.03213: variable 'ansible_shell_executable' from source: unknown 7491 1727204009.03215: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204009.03217: variable 'ansible_pipelining' from source: unknown 7491 1727204009.03222: variable 'ansible_timeout' from source: unknown 7491 1727204009.03225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204009.03327: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204009.03336: variable 'omit' from source: magic vars 7491 1727204009.03342: starting attempt loop 7491 1727204009.03345: running the handler 7491 1727204009.03353: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204009.03372: _low_level_execute_command(): starting 7491 1727204009.03379: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727204009.03917: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.03942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.03966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.03978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.04015: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 7491 1727204009.04032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.04085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.05706: stdout chunk (state=3): >>>/root <<< 7491 1727204009.05809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204009.05863: stderr chunk (state=3): >>><<< 7491 1727204009.05868: stdout chunk (state=3): >>><<< 7491 1727204009.05893: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204009.05904: _low_level_execute_command(): starting 7491 1727204009.05910: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204009.0589359-9704-90619433828494 `" && echo ansible-tmp-1727204009.0589359-9704-90619433828494="` echo /root/.ansible/tmp/ansible-tmp-1727204009.0589359-9704-90619433828494 `" ) && sleep 0' 7491 1727204009.06362: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.06370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.06403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.06415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.06430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.06478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204009.06490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.06541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.08353: stdout chunk (state=3): >>>ansible-tmp-1727204009.0589359-9704-90619433828494=/root/.ansible/tmp/ansible-tmp-1727204009.0589359-9704-90619433828494 <<< 7491 1727204009.08462: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 7491 1727204009.08518: stderr chunk (state=3): >>><<< 7491 1727204009.08524: stdout chunk (state=3): >>><<< 7491 1727204009.08538: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204009.0589359-9704-90619433828494=/root/.ansible/tmp/ansible-tmp-1727204009.0589359-9704-90619433828494 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204009.08569: variable 'ansible_module_compression' from source: unknown 7491 1727204009.08618: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727204009.08649: variable 'ansible_facts' from source: unknown 7491 1727204009.08713: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204009.0589359-9704-90619433828494/AnsiballZ_command.py 7491 1727204009.08827: Sending initial data 7491 1727204009.08831: Sent initial data 
(153 bytes) 7491 1727204009.09518: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.09528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.09559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204009.09575: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.09586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.09631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204009.09643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.09697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.11365: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727204009.11399: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727204009.11440: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpxtpytxo1 /root/.ansible/tmp/ansible-tmp-1727204009.0589359-9704-90619433828494/AnsiballZ_command.py <<< 7491 1727204009.11477: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727204009.12263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204009.12376: stderr chunk (state=3): >>><<< 7491 1727204009.12380: stdout chunk (state=3): >>><<< 7491 1727204009.12396: done transferring module to remote 7491 1727204009.12406: _low_level_execute_command(): starting 7491 1727204009.12411: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204009.0589359-9704-90619433828494/ /root/.ansible/tmp/ansible-tmp-1727204009.0589359-9704-90619433828494/AnsiballZ_command.py && sleep 0' 7491 1727204009.12879: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.12886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.12915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204009.12929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727204009.12939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.12993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204009.12998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.13048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.14707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204009.14759: stderr chunk (state=3): >>><<< 7491 1727204009.14762: stdout chunk (state=3): >>><<< 7491 1727204009.14779: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204009.14782: _low_level_execute_command(): starting 7491 1727204009.14789: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204009.0589359-9704-90619433828494/AnsiballZ_command.py && sleep 0' 7491 1727204009.15242: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204009.15248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.15282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.15294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.15351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204009.15370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.15410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.29156: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3258sec preferred_lft 3258sec\n inet6 fe80::8ff:f5ff:fed7:be93/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:53:29.282390", "end": "2024-09-24 14:53:29.290431", "delta": "0:00:00.008041", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727204009.30256: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727204009.30312: stderr chunk (state=3): >>><<< 7491 1727204009.30316: stdout chunk (state=3): >>><<< 7491 1727204009.30339: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3258sec preferred_lft 3258sec\n inet6 fe80::8ff:f5ff:fed7:be93/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:53:29.282390", "end": "2024-09-24 14:53:29.290431", "delta": "0:00:00.008041", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
7491 1727204009.30381: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204009.0589359-9704-90619433828494/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727204009.30388: _low_level_execute_command(): starting 7491 1727204009.30394: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204009.0589359-9704-90619433828494/ > /dev/null 2>&1 && sleep 0' 7491 1727204009.30868: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.30880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.30910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204009.30926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.30976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204009.30989: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.31044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.32771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204009.32828: stderr chunk (state=3): >>><<< 7491 1727204009.32832: stdout chunk (state=3): >>><<< 7491 1727204009.32844: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204009.32851: handler run complete 7491 1727204009.32874: Evaluated conditional (False): False 7491 1727204009.32882: attempt loop complete, returning result 7491 1727204009.32885: _execute() done 7491 1727204009.32887: dumping result to json 7491 1727204009.32893: done dumping result, returning 7491 1727204009.32901: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [0affcd87-79f5-0a4a-ad01-000000001d93] 7491 1727204009.32909: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d93 7491 1727204009.33014: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d93 7491 1727204009.33016: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008041", "end": "2024-09-24 14:53:29.290431", "rc": 0, "start": "2024-09-24 14:53:29.282390" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3258sec preferred_lft 3258sec inet6 fe80::8ff:f5ff:fed7:be93/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 IP -6 ROUTE 
::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 7491 1727204009.33088: no more pending results, returning what we have 7491 1727204009.33092: results queue empty 7491 1727204009.33093: checking for any_errors_fatal 7491 1727204009.33094: done checking for any_errors_fatal 7491 1727204009.33095: checking for max_fail_percentage 7491 1727204009.33096: done checking for max_fail_percentage 7491 1727204009.33097: checking to see if all hosts have failed and the running result is not ok 7491 1727204009.33098: done checking to see if all hosts have failed 7491 1727204009.33099: getting the remaining hosts for this loop 7491 1727204009.33101: done getting the remaining hosts for this loop 7491 1727204009.33105: getting the next task for host managed-node3 7491 1727204009.33110: done getting next task for host managed-node3 7491 1727204009.33112: ^ task is: TASK: Verify DNS and network connectivity 7491 1727204009.33114: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7491 1727204009.33118: getting variables 7491 1727204009.33122: in VariableManager get_vars() 7491 1727204009.33181: Calling all_inventory to load vars for managed-node3 7491 1727204009.33184: Calling groups_inventory to load vars for managed-node3 7491 1727204009.33191: Calling all_plugins_inventory to load vars for managed-node3 7491 1727204009.33202: Calling all_plugins_play to load vars for managed-node3 7491 1727204009.33205: Calling groups_plugins_inventory to load vars for managed-node3 7491 1727204009.33208: Calling groups_plugins_play to load vars for managed-node3 7491 1727204009.34172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7491 1727204009.35104: done with get_vars() 7491 1727204009.35124: done getting variables 7491 1727204009.35169: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:53:29 -0400 (0:00:00.332) 0:00:51.275 ***** 7491 1727204009.35191: entering _queue_task() for managed-node3/shell 7491 1727204009.35416: worker is 1 (out of 1 available) 7491 1727204009.35430: exiting _queue_task() for managed-node3/shell 7491 1727204009.35443: done queuing things up, now waiting for results queue to drain 7491 1727204009.35444: waiting for pending results... 
7491 1727204009.35638: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 7491 1727204009.35703: in run() - task 0affcd87-79f5-0a4a-ad01-000000001d94 7491 1727204009.35714: variable 'ansible_search_path' from source: unknown 7491 1727204009.35717: variable 'ansible_search_path' from source: unknown 7491 1727204009.35746: calling self._execute() 7491 1727204009.35831: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204009.35835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204009.35844: variable 'omit' from source: magic vars 7491 1727204009.36126: variable 'ansible_distribution_major_version' from source: facts 7491 1727204009.36135: Evaluated conditional (ansible_distribution_major_version != '6'): True 7491 1727204009.36234: variable 'ansible_facts' from source: unknown 7491 1727204009.36713: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 7491 1727204009.36722: variable 'omit' from source: magic vars 7491 1727204009.36749: variable 'omit' from source: magic vars 7491 1727204009.36774: variable 'omit' from source: magic vars 7491 1727204009.36807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7491 1727204009.36834: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7491 1727204009.36850: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7491 1727204009.36873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204009.36879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7491 1727204009.36903: variable 'inventory_hostname' from source: host vars for 'managed-node3' 7491 1727204009.36906: variable 'ansible_host' from 
source: host vars for 'managed-node3' 7491 1727204009.36908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204009.36981: Set connection var ansible_timeout to 10 7491 1727204009.36985: Set connection var ansible_pipelining to False 7491 1727204009.36988: Set connection var ansible_shell_type to sh 7491 1727204009.36994: Set connection var ansible_module_compression to ZIP_DEFLATED 7491 1727204009.37001: Set connection var ansible_shell_executable to /bin/sh 7491 1727204009.37005: Set connection var ansible_connection to ssh 7491 1727204009.37024: variable 'ansible_shell_executable' from source: unknown 7491 1727204009.37027: variable 'ansible_connection' from source: unknown 7491 1727204009.37030: variable 'ansible_module_compression' from source: unknown 7491 1727204009.37032: variable 'ansible_shell_type' from source: unknown 7491 1727204009.37034: variable 'ansible_shell_executable' from source: unknown 7491 1727204009.37036: variable 'ansible_host' from source: host vars for 'managed-node3' 7491 1727204009.37038: variable 'ansible_pipelining' from source: unknown 7491 1727204009.37040: variable 'ansible_timeout' from source: unknown 7491 1727204009.37045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 7491 1727204009.37146: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204009.37155: variable 'omit' from source: magic vars 7491 1727204009.37160: starting attempt loop 7491 1727204009.37163: running the handler 7491 1727204009.37173: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7491 1727204009.37190: _low_level_execute_command(): starting 7491 1727204009.37200: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7491 1727204009.37725: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.37733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.37786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204009.37789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204009.37792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.37838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204009.37849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.37908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.39423: stdout chunk (state=3): >>>/root <<< 7491 1727204009.39518: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 7491 1727204009.39580: stderr chunk (state=3): >>><<< 7491 1727204009.39587: stdout chunk (state=3): >>><<< 7491 1727204009.39608: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204009.39621: _low_level_execute_command(): starting 7491 1727204009.39625: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204009.396084-9719-65708295828413 `" && echo ansible-tmp-1727204009.396084-9719-65708295828413="` echo /root/.ansible/tmp/ansible-tmp-1727204009.396084-9719-65708295828413 `" ) && sleep 0' 7491 1727204009.40086: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.40098: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.40123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.40139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.40142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.40181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204009.40188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.40247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.42047: stdout chunk (state=3): >>>ansible-tmp-1727204009.396084-9719-65708295828413=/root/.ansible/tmp/ansible-tmp-1727204009.396084-9719-65708295828413 <<< 7491 1727204009.42162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204009.42215: stderr chunk (state=3): >>><<< 7491 1727204009.42218: stdout chunk (state=3): >>><<< 7491 1727204009.42237: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204009.396084-9719-65708295828413=/root/.ansible/tmp/ansible-tmp-1727204009.396084-9719-65708295828413 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204009.42265: variable 'ansible_module_compression' from source: unknown 7491 1727204009.42311: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-749106ks271n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7491 1727204009.42344: variable 'ansible_facts' from source: unknown 7491 1727204009.42408: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204009.396084-9719-65708295828413/AnsiballZ_command.py 7491 1727204009.42524: Sending initial data 7491 1727204009.42528: Sent initial data (152 bytes) 7491 1727204009.43219: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204009.43223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.43261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 
10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7491 1727204009.43268: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.43322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204009.43325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204009.43328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.43375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.45021: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7491 1727204009.45050: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7491 1727204009.45092: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-749106ks271n/tmpykxaj79k 
/root/.ansible/tmp/ansible-tmp-1727204009.396084-9719-65708295828413/AnsiballZ_command.py <<< 7491 1727204009.45130: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7491 1727204009.45916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204009.46029: stderr chunk (state=3): >>><<< 7491 1727204009.46032: stdout chunk (state=3): >>><<< 7491 1727204009.46050: done transferring module to remote 7491 1727204009.46060: _low_level_execute_command(): starting 7491 1727204009.46066: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204009.396084-9719-65708295828413/ /root/.ansible/tmp/ansible-tmp-1727204009.396084-9719-65708295828413/AnsiballZ_command.py && sleep 0' 7491 1727204009.46519: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204009.46528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.46573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.46577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.46579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.46633: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7491 1727204009.46643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.46695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.48352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204009.48408: stderr chunk (state=3): >>><<< 7491 1727204009.48411: stdout chunk (state=3): >>><<< 7491 1727204009.48430: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 1727204009.48434: _low_level_execute_command(): starting 7491 1727204009.48438: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204009.396084-9719-65708295828413/AnsiballZ_command.py && sleep 0' 7491 
1727204009.48897: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204009.48901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.48931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 7491 1727204009.48939: stderr chunk (state=3): >>>debug2: match not found <<< 7491 1727204009.48950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.48967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 7491 1727204009.48973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7491 1727204009.48979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.48985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727204009.48994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.49043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204009.49059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.49107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.70494: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org 
mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 9838 0 --:--:-- --:--:-- --:--:-- 9838\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 11640 0 --:--:-- --:--:-- --:--:-- 11640", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:53:29.620743", "end": "2024-09-24 14:53:29.703954", "delta": "0:00:00.083211", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7491 1727204009.71722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 7491 1727204009.71788: stderr chunk (state=3): >>><<< 7491 1727204009.71793: stdout chunk (state=3): >>><<< 7491 1727204009.71815: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 9838 0 --:--:-- --:--:-- --:--:-- 9838\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 11640 0 --:--:-- --:--:-- --:--:-- 11640", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:53:29.620743", "end": "2024-09-24 14:53:29.703954", "delta": "0:00:00.083211", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 7491 1727204009.71856: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204009.396084-9719-65708295828413/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7491 1727204009.71866: _low_level_execute_command(): starting 7491 1727204009.71871: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204009.396084-9719-65708295828413/ > /dev/null 2>&1 && sleep 0' 7491 1727204009.72338: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7491 1727204009.72350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7491 1727204009.72386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 7491 1727204009.72400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 7491 1727204009.72411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7491 1727204009.72461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7491 1727204009.72470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7491 1727204009.72528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7491 1727204009.74255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7491 1727204009.74312: stderr chunk (state=3): >>><<< 7491 1727204009.74317: stdout chunk (state=3): >>><<< 7491 1727204009.74332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7491 
1727204009.74338: handler run complete 7491 1727204009.74357: Evaluated conditional (False): False 7491 1727204009.74366: attempt loop complete, returning result 7491 1727204009.74369: _execute() done 7491 1727204009.74372: dumping result to json 7491 1727204009.74379: done dumping result, returning 7491 1727204009.74387: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [0affcd87-79f5-0a4a-ad01-000000001d94] 7491 1727204009.74393: sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d94 7491 1727204009.74505: done sending task result for task 0affcd87-79f5-0a4a-ad01-000000001d94 7491 1727204009.74509: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.083211", "end": "2024-09-24 14:53:29.703954", "rc": 0, "start": "2024-09-24 14:53:29.620743" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 
wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 9838 0 --:--:-- --:--:-- --:--:-- 9838 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 11640 0 --:--:-- --:--:-- --:--:-- 11640 7491 1727204009.74583: no more pending results, returning what we have 7491 1727204009.74586: results queue empty 7491 1727204009.74587: checking for any_errors_fatal 7491 1727204009.74600: done checking for any_errors_fatal 7491 1727204009.74601: checking for max_fail_percentage 7491 1727204009.74602: done checking for max_fail_percentage 7491 1727204009.74603: checking to see if all hosts have failed and the running result is not ok 7491 1727204009.74604: done checking to see if all hosts have failed 7491 1727204009.74605: getting the remaining hosts for this loop 7491 1727204009.74606: done getting the remaining hosts for this loop 7491 1727204009.74610: getting the next task for host managed-node3 7491 1727204009.74626: done getting next task for host managed-node3 7491 1727204009.74629: ^ task is: TASK: meta (flush_handlers) 7491 1727204009.74631: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
7491 1727204009.74636: getting variables
7491 1727204009.74638: in VariableManager get_vars()
7491 1727204009.74690: Calling all_inventory to load vars for managed-node3
7491 1727204009.74693: Calling groups_inventory to load vars for managed-node3
7491 1727204009.74695: Calling all_plugins_inventory to load vars for managed-node3
7491 1727204009.74706: Calling all_plugins_play to load vars for managed-node3
7491 1727204009.74708: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727204009.74710: Calling groups_plugins_play to load vars for managed-node3
7491 1727204009.75577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727204009.76533: done with get_vars()
7491 1727204009.76554: done getting variables
7491 1727204009.76608: in VariableManager get_vars()
7491 1727204009.76625: Calling all_inventory to load vars for managed-node3
7491 1727204009.76627: Calling groups_inventory to load vars for managed-node3
7491 1727204009.76628: Calling all_plugins_inventory to load vars for managed-node3
7491 1727204009.76632: Calling all_plugins_play to load vars for managed-node3
7491 1727204009.76634: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727204009.76636: Calling groups_plugins_play to load vars for managed-node3
7491 1727204009.77414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727204009.78353: done with get_vars()
7491 1727204009.78378: done queuing things up, now waiting for results queue to drain
7491 1727204009.78380: results queue empty
7491 1727204009.78381: checking for any_errors_fatal
7491 1727204009.78383: done checking for any_errors_fatal
7491 1727204009.78384: checking for max_fail_percentage
7491 1727204009.78384: done checking for max_fail_percentage
7491 1727204009.78385: checking to see if all hosts have failed and the running result is not ok
7491 1727204009.78386: done checking to see if all hosts have failed
7491 1727204009.78386: getting the remaining hosts for this loop
7491 1727204009.78387: done getting the remaining hosts for this loop
7491 1727204009.78389: getting the next task for host managed-node3
7491 1727204009.78392: done getting next task for host managed-node3
7491 1727204009.78394: ^ task is: TASK: meta (flush_handlers)
7491 1727204009.78395: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727204009.78396: getting variables
7491 1727204009.78397: in VariableManager get_vars()
7491 1727204009.78412: Calling all_inventory to load vars for managed-node3
7491 1727204009.78413: Calling groups_inventory to load vars for managed-node3
7491 1727204009.78414: Calling all_plugins_inventory to load vars for managed-node3
7491 1727204009.78418: Calling all_plugins_play to load vars for managed-node3
7491 1727204009.78422: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727204009.78424: Calling groups_plugins_play to load vars for managed-node3
7491 1727204009.79110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727204009.80069: done with get_vars()
7491 1727204009.80084: done getting variables
7491 1727204009.80125: in VariableManager get_vars()
7491 1727204009.80138: Calling all_inventory to load vars for managed-node3
7491 1727204009.80140: Calling groups_inventory to load vars for managed-node3
7491 1727204009.80141: Calling all_plugins_inventory to load vars for managed-node3
7491 1727204009.80145: Calling all_plugins_play to load vars for managed-node3
7491 1727204009.80147: Calling groups_plugins_inventory to load vars for managed-node3
7491 1727204009.80148: Calling groups_plugins_play to load vars for managed-node3
7491 1727204009.80832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7491 1727204009.81762: done with get_vars()
7491 1727204009.81786: done queuing things up, now waiting for results queue to drain
7491 1727204009.81787: results queue empty
7491 1727204009.81788: checking for any_errors_fatal
7491 1727204009.81789: done checking for any_errors_fatal
7491 1727204009.81789: checking for max_fail_percentage
7491 1727204009.81790: done checking for max_fail_percentage
7491 1727204009.81790: checking to see if all hosts have failed and the running result is not ok
7491 1727204009.81791: done checking to see if all hosts have failed
7491 1727204009.81792: getting the remaining hosts for this loop
7491 1727204009.81792: done getting the remaining hosts for this loop
7491 1727204009.81794: getting the next task for host managed-node3
7491 1727204009.81797: done getting next task for host managed-node3
7491 1727204009.81797: ^ task is: None
7491 1727204009.81798: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7491 1727204009.81799: done queuing things up, now waiting for results queue to drain
7491 1727204009.81799: results queue empty
7491 1727204009.81800: checking for any_errors_fatal
7491 1727204009.81800: done checking for any_errors_fatal
7491 1727204009.81801: checking for max_fail_percentage
7491 1727204009.81801: done checking for max_fail_percentage
7491 1727204009.81802: checking to see if all hosts have failed and the running result is not ok
7491 1727204009.81802: done checking to see if all hosts have failed
7491 1727204009.81804: getting the next task for host managed-node3
7491 1727204009.81806: done getting next task for host managed-node3
7491 1727204009.81806: ^ task is: None
7491 1727204009.81807: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node3              : ok=128  changed=4    unreachable=0    failed=0    skipped=118  rescued=0    ignored=0

Tuesday 24 September 2024  14:53:29 -0400 (0:00:00.467)       0:00:51.742 *****
===============================================================================
Install iproute --------------------------------------------------------- 3.89s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which services are running ---- 1.66s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.60s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.60s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.52s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.45s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:6
Install iproute --------------------------------------------------------- 1.33s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Install iproute --------------------------------------------------------- 1.23s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Install iproute --------------------------------------------------------- 1.22s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 1.20s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.04s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.99s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:3
Create veth interface veth0 --------------------------------------------- 0.98s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Create veth interface veth0 --------------------------------------------- 0.96s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 0.93s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.87s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.81s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.81s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Check if system is ostree ----------------------------------------------- 0.79s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Gather the minimum subset of ansible_facts required by the network role test --- 0.64s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
7491 1727204009.81960: RUNNING CLEANUP